Apr 21 00:00:30.684886 ip-10-0-143-115 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 00:00:30.684898 ip-10-0-143-115 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 00:00:30.684905 ip-10-0-143-115 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 00:00:30.685143 ip-10-0-143-115 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 00:00:40.910259 ip-10-0-143-115 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 00:00:40.910275 ip-10-0-143-115 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 968e62fcb2e94bbeb333e6ea48c7a623 --
Apr 21 00:02:58.263359 ip-10-0-143-115 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 00:02:58.807972 ip-10-0-143-115 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 00:02:58.807972 ip-10-0-143-115 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 00:02:58.807972 ip-10-0-143-115 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 00:02:58.807972 ip-10-0-143-115 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
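The deprecation warnings above all point at the file named by the kubelet's --config flag. As a minimal sketch only, the flagged parameters could be carried in a KubeletConfiguration like the following; field names follow the kubelet.config.k8s.io/v1beta1 API, the socket path is the one shown later in this node's flag dump, and the other values are hypothetical placeholders, not values recoverable from this log:

```yaml
# Sketch of moving deprecated kubelet flags into the --config file.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Replaces --container-runtime-endpoint (value from this node's flag dump).
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# Replaces --volume-plugin-dir; path here is a hypothetical example.
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# Replaces --system-reserved; amounts below are hypothetical, the log does not show them.
systemReserved:
  cpu: "500m"
  memory: "1Gi"
```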
Apr 21 00:02:58.807972 ip-10-0-143-115 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 00:02:58.807972 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.742581 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 00:02:58.807972 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747798 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 00:02:58.807972 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747812 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 00:02:58.807972 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747816 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 00:02:58.807972 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747819 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 00:02:58.807972 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747822 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 00:02:58.807972 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747825 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 00:02:58.807972 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747828 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 00:02:58.807972 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747831 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747834 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747837 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747841 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747848 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747852 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747855 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747858 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747860 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747863 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747866 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747869 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747872 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747874 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747877 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747879 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747882 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747885 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747887 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747890 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 00:02:58.808974 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747892 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747895 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747897 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747900 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747903 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747905 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747909 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747913 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747916 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747920 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747923 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747926 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747929 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747932 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747935 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747938 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747940 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747943 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747946 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 00:02:58.809552 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747948 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747951 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747953 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747956 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747959 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747962 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747964 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747967 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747970 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747974 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747977 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747982 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747986 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747990 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747993 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747996 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.747999 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748001 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748004 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748007 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 00:02:58.810242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748011 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748013 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748016 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748019 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748021 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748024 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748026 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748030 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748032 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748036 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748038 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748041 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748043 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748046 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748048 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748051 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748053 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748056 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748059 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748061 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 00:02:58.810829 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748462 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748469 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748473 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748475 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748478 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748482 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748484 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748487 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748490 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748492 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748495 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748498 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748502 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748504 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748507 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748510 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748515 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748518 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748521 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748523 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 00:02:58.811523 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748526 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748529 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748531 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748534 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748537 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748540 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748542 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748546 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748550 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748553 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748555 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748558 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748561 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748563 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748565 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748568 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748571 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748574 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748576 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 00:02:58.812320 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748579 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748581 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748583 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748586 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748589 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748592 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748595 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748597 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748600 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748604 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748606 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748609 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748612 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748614 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748617 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748619 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748622 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748624 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748627 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748629 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 00:02:58.814530 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748631 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748634 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748638 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748641 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748644 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748647 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748650 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748652 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748655 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748657 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748660 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748662 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748665 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748667 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748670 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748672 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748676 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748679 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748681 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 00:02:58.815387 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748684 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748687 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748691 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748693 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748696 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748698 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748701 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.748703 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749833 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749846 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749854 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749858 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749863 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749866 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749871 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749876 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749880 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749883 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749886 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749890 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749893 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749896 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 21 00:02:58.817637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749899 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749902 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749905 2571 flags.go:64] FLAG: --cloud-config=""
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749908 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749911 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749916 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749919 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749922 2571 flags.go:64] FLAG: --config-dir=""
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749925 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749929 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749933 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749937 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749942 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749946 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749949 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749952 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749955 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749958 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749961 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749966 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749969 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749972 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749975 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749978 2571 flags.go:64] FLAG: --enable-server="true"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749980 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 00:02:58.818729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749985 2571 flags.go:64] FLAG: --event-burst="100"
Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749988 2571 flags.go:64] FLAG: --event-qps="50"
Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749991 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749994 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.749997 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421
00:02:58.750001 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750004 2571 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750007 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750010 2571 flags.go:64] FLAG: --eviction-soft="" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750013 2571 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750016 2571 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750019 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750022 2571 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750025 2571 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750028 2571 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750031 2571 flags.go:64] FLAG: --feature-gates="" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750039 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750042 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750045 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750050 2571 flags.go:64] FLAG: 
--healthz-bind-address="127.0.0.1" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750054 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750057 2571 flags.go:64] FLAG: --help="false" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750060 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-143-115.ec2.internal" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750063 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 00:02:58.819757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750066 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750069 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750073 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750077 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750079 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750082 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750085 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750088 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750105 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 00:02:58.820537 
ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750108 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750111 2571 flags.go:64] FLAG: --kube-reserved="" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750114 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750117 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750120 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750123 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750126 2571 flags.go:64] FLAG: --lock-file="" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750129 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750132 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750135 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750140 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750143 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750146 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750149 2571 flags.go:64] FLAG: --logging-format="text" Apr 21 00:02:58.820537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750152 2571 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750157 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750160 2571 flags.go:64] FLAG: --manifest-url="" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750163 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750169 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750172 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750177 2571 flags.go:64] FLAG: --max-pods="110" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750180 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750183 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750185 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750189 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750192 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750194 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750197 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750205 2571 flags.go:64] FLAG: 
--node-status-max-images="50" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750208 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750211 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750214 2571 flags.go:64] FLAG: --pod-cidr="" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750217 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750223 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750226 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750229 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750232 2571 flags.go:64] FLAG: --port="10250" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750235 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 00:02:59.075636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750238 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f359889dd69dc182" Apr 21 00:02:58.857036 ip-10-0-143-115 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750241 2571 flags.go:64] FLAG: --qos-reserved="" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750244 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750247 2571 flags.go:64] FLAG: --register-node="true" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750250 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750253 2571 flags.go:64] FLAG: --register-with-taints="" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750256 2571 flags.go:64] FLAG: --registry-burst="10" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750259 2571 flags.go:64] FLAG: --registry-qps="5" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750262 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750265 2571 flags.go:64] FLAG: --reserved-memory="" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750270 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750273 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750277 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750280 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750283 2571 flags.go:64] FLAG: --runonce="false" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750286 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 
00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750289 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750293 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750295 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750298 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750301 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750304 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750307 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750310 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750313 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750316 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 00:02:59.092136 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750319 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750322 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750325 2571 flags.go:64] FLAG: --system-cgroups="" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750328 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 00:02:59.093319 ip-10-0-143-115 
kubenswrapper[2571]: I0421 00:02:58.750333 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750336 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750339 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750343 2571 flags.go:64] FLAG: --tls-min-version="" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750347 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750350 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750352 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750355 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750358 2571 flags.go:64] FLAG: --v="2" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750363 2571 flags.go:64] FLAG: --version="false" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750367 2571 flags.go:64] FLAG: --vmodule="" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750371 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.750376 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750470 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750476 2571 feature_gate.go:328] unrecognized feature gate: 
Example Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750480 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750484 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750490 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750493 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 00:02:59.093319 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750496 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750498 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750501 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750504 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750507 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750509 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750512 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750515 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750517 2571 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750520 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750523 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750526 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750529 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750531 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750534 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750536 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750539 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750542 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750544 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 00:02:59.094155 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750547 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750550 2571 feature_gate.go:328] unrecognized 
feature gate: AdditionalRoutingCapabilities Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750552 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750555 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750558 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750560 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750564 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750567 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750570 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750573 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750576 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750578 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750581 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750584 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 
00:02:58.750586 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750589 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750591 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750594 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750596 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750599 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 00:02:59.095042 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750601 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750604 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750607 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750609 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750612 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750614 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750617 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 00:02:59.095729 ip-10-0-143-115 
kubenswrapper[2571]: W0421 00:02:58.750620 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750622 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750625 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750628 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750630 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750633 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750635 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750638 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750640 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750643 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750645 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750649 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 00:02:59.095729 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750652 2571 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750655 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750658 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750662 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750666 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750669 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750672 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750674 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750677 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750679 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750682 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750685 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750687 2571 feature_gate.go:328] unrecognized feature gate: 
DyanmicServiceEndpointIBMCloud
Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750690 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750692 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750695 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750697 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750700 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750702 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750704 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 00:02:59.096605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750707 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.750710 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.751846 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.759695 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.759715 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759778 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759786 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759791 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759796 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759800 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759804 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759808 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759813 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759817 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759821 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759825 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 00:02:59.097281 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759829 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759833 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759837 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759841 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759846 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759850 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759855 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759862 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759866 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759870 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759875 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759879 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759883 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759887 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759891 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759895 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759899 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759903 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759907 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 00:02:59.098120 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759911 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759924 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759929 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759933 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759937 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759941 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759945 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759948 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759952 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759957 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759960 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759964 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759968 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759972 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759976 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759979 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759985 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759990 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759994 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 00:02:59.098832 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.759999 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760003 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760007 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760011 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760014 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760018 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760022 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760025 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760029 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760033 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760036 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760040 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760044 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760047 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760051 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760056 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760059 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760063 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760070 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760074 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 00:02:59.099591 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760078 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760082 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760086 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760104 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760109 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760112 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760116 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760120 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760124 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760128 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760131 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760137 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760141 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760145 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760149 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760153 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 00:02:59.100241 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760157 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.760164 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760299 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760306 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760311 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760315 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760320 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760324 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760328 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760333 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760337 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760340 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760345 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760348 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760352 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760357 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 00:02:59.421873 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760361 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760365 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760369 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760372 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760376 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760380 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760383 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760387 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760391 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760394 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760398 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760403 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760406 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760411 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760414 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760418 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760421 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760425 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760429 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760432 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 00:02:59.423178 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760436 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760440 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760444 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760448 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760451 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760454 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760458 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760461 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760465 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760469 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760472 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760478 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760481 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760485 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760489 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760492 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760495 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760499 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760505 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760510 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 00:02:59.424813 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760514 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760519 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760523 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760528 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760532 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760536 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760540 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760543 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760548 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760552 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760555 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760559 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760563 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760566 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760570 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760574 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760577 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760581 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760584 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 00:02:59.425424 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760588 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760592 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760596 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760600 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760607 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760610 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760614 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760618 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760621 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760625 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760629 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760633 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:02:58.760636 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.760643 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.761403 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 00:02:59.426249 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.763373 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 00:02:59.426718 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.764324 2571 server.go:1019] "Starting client certificate rotation"
Apr 21 00:02:59.426718 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.764418 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 00:02:59.426718 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.764456 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 00:02:59.426718 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.794667 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 00:02:59.426718 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.797158 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 00:02:59.426718 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.819408 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 21 00:02:59.426718 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.822583 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 00:02:59.426718 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.825137 2571 log.go:25] "Validated CRI v1 image API"
Apr 21 00:02:59.426718 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.826744 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 00:02:59.426718 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.830636 2571 fs.go:135] Filesystem UUIDs: map[60e11d6c-2181-4737-b9ef-54831823f012:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 cd696b89-26bd-4afd-b75c-cb24fc6bb49d:/dev/nvme0n1p4]
Apr 21 00:02:59.426718 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.830651 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.836545 2571 manager.go:217] Machine: {Timestamp:2026-04-21 00:02:58.834175447 +0000 UTC m=+0.447546959 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3090545 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec259bd1b1a2bf64381e5b9a3c351db9 SystemUUID:ec259bd1-b1a2-bf64-381e-5b9a3c351db9 BootID:968e62fc-b2e9-4bbe-b333-e6ea48c7a623 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ef:2a:30:36:7b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ef:2a:30:36:7b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:52:48:85:bf:d3:df Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.837256 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.837335 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.839318 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.839515 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-115.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.839688 2571 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.839698 2571 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.839711 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.840575 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.842126 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.842232 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.844826 2571 kubelet.go:491] "Attempting to sync node with API server" Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.844837 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.844854 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.844864 2571 kubelet.go:397] "Adding apiserver pod source" Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.844872 2571 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.845841 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 00:02:59.427449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.845855 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.848857 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.850654 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.851693 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.851706 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.851712 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.851717 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.851722 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.851728 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.851734 2571 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.851740 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.851747 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.851753 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.851761 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.851770 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.852631 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.852639 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.856113 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.856144 2571 server.go:1295] "Started kubelet" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.856235 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.856242 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.856315 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.857800 
2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-115.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:58.858580 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:58.858581 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-115.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.859005 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.859745 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.863378 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6nqvv" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:58.863521 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-115.ec2.internal.18a836520293d63f default 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-115.ec2.internal,UID:ip-10-0-143-115.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-115.ec2.internal,},FirstTimestamp:2026-04-21 00:02:58.856121919 +0000 UTC m=+0.469493431,LastTimestamp:2026-04-21 00:02:58.856121919 +0000 UTC m=+0.469493431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-115.ec2.internal,}" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.866466 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.867084 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.867683 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.867702 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.867683 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.867854 2571 factory.go:55] Registering systemd factory Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.867870 2571 factory.go:223] Registration of the systemd container factory successfully Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.867871 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.867904 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 21 
00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:58.868145 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-115.ec2.internal\" not found" Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.869685 2571 factory.go:153] Registering CRI-O factory Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.869703 2571 factory.go:223] Registration of the crio container factory successfully Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.869751 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 00:02:59.428023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.869772 2571 factory.go:103] Registering Raw factory Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.869792 2571 manager.go:1196] Started watching for new ooms in manager Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:58.870084 2571 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.870313 2571 manager.go:319] Starting recovery of all containers Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.870342 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6nqvv" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:58.873597 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-115.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:58.879872 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.889020 2571 manager.go:324] Recovery completed Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.893373 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.895677 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasSufficientMemory" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.895702 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 
21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.895711 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasSufficientPID" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.896134 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.896142 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.896159 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.898335 2571 policy_none.go:49] "None policy: Start" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.898347 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.898357 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.951632 2571 manager.go:341] "Starting Device Plugin manager" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:58.951663 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.951672 2571 server.go:85] "Starting device plugin registration server" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.951893 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.951933 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.951994 2571 
plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.952077 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.952087 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:58.952551 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:58.952583 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-115.ec2.internal\" not found" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.983696 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.984869 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.984893 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.984909 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.984916 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:58.984962 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:58.987585 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.052614 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.053520 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasSufficientMemory" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.053546 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.053555 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasSufficientPID" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.053585 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-115.ec2.internal" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.059482 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-115.ec2.internal" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:59.059501 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-115.ec2.internal\": node \"ip-10-0-143-115.ec2.internal\" not found" Apr 21 
00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.085226 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-115.ec2.internal"] Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.085288 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 00:02:59.429202 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:59.085907 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-115.ec2.internal\" not found" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.086613 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasSufficientMemory" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.086678 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.086689 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasSufficientPID" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.088262 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.088409 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.088446 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.088958 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasSufficientMemory" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.088989 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.089006 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasSufficientMemory" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.089033 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.089043 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasSufficientPID" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.089015 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasSufficientPID" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.091191 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-115.ec2.internal" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.091213 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.091820 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasSufficientMemory" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.091851 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.091861 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeHasSufficientPID" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:59.114665 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-115.ec2.internal\" not found" node="ip-10-0-143-115.ec2.internal" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:59.118973 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-115.ec2.internal\" not found" node="ip-10-0-143-115.ec2.internal" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.169251 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b1cf2d95e85fe20270acb11fb3142e37-config\") pod \"kube-apiserver-proxy-ip-10-0-143-115.ec2.internal\" (UID: \"b1cf2d95e85fe20270acb11fb3142e37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-115.ec2.internal" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 
00:02:59.169273 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eb7e593db137b9f51b2a7a346953aea0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal\" (UID: \"eb7e593db137b9f51b2a7a346953aea0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.169292 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb7e593db137b9f51b2a7a346953aea0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal\" (UID: \"eb7e593db137b9f51b2a7a346953aea0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:59.186630 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-115.ec2.internal\" not found" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.269621 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b1cf2d95e85fe20270acb11fb3142e37-config\") pod \"kube-apiserver-proxy-ip-10-0-143-115.ec2.internal\" (UID: \"b1cf2d95e85fe20270acb11fb3142e37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-115.ec2.internal" Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.269648 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eb7e593db137b9f51b2a7a346953aea0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal\" (UID: \"eb7e593db137b9f51b2a7a346953aea0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal" Apr 21 
00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.269666 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb7e593db137b9f51b2a7a346953aea0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal\" (UID: \"eb7e593db137b9f51b2a7a346953aea0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal"
Apr 21 00:02:59.430469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.269701 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb7e593db137b9f51b2a7a346953aea0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal\" (UID: \"eb7e593db137b9f51b2a7a346953aea0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal"
Apr 21 00:02:59.431235 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.269706 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b1cf2d95e85fe20270acb11fb3142e37-config\") pod \"kube-apiserver-proxy-ip-10-0-143-115.ec2.internal\" (UID: \"b1cf2d95e85fe20270acb11fb3142e37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-115.ec2.internal"
Apr 21 00:02:59.431235 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.269728 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eb7e593db137b9f51b2a7a346953aea0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal\" (UID: \"eb7e593db137b9f51b2a7a346953aea0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal"
Apr 21 00:02:59.431235 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:59.287674 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-115.ec2.internal\" not found"
Apr 21 00:02:59.431235 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:59.388400 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-115.ec2.internal\" not found"
Apr 21 00:02:59.431235 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.416566 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal"
Apr 21 00:02:59.431235 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.421083 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-115.ec2.internal"
Apr 21 00:02:59.743334 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:59.488976 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-115.ec2.internal\" not found"
Apr 21 00:02:59.743334 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:59.589410 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-115.ec2.internal\" not found"
Apr 21 00:02:59.743334 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:59.689911 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-115.ec2.internal\" not found"
Apr 21 00:02:59.743334 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.697932 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 00:03:00.040582 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.763923 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 00:03:00.040582 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.764034 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 00:03:00.040582 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.764065 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 00:03:00.040582 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:59.790822 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-115.ec2.internal\" not found"
Apr 21 00:03:00.040582 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.866577 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 00:03:00.040582 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.872500 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 23:57:58 +0000 UTC" deadline="2027-12-20 08:47:50.646917154 +0000 UTC"
Apr 21 00:03:00.040582 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.872521 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14600h44m50.774399249s"
Apr 21 00:03:00.040582 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.879955 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 00:03:00.040582 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:59.890868 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-115.ec2.internal\" not found"
Apr 21 00:03:00.040582 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.907821 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8jrmc"
Apr 21 00:03:00.040582 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:02:59.915840 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8jrmc"
Apr 21 00:03:00.040582 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:02:59.990933 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-115.ec2.internal\" not found"
Apr 21 00:03:00.082356 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.082324 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 00:03:00.115697 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:00.115662 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1cf2d95e85fe20270acb11fb3142e37.slice/crio-cc8cd87088d32fa05598eeb1f9b4166c532a4216179d83eb4bf21e6f95a9682f WatchSource:0}: Error finding container cc8cd87088d32fa05598eeb1f9b4166c532a4216179d83eb4bf21e6f95a9682f: Status 404 returned error can't find the container with id cc8cd87088d32fa05598eeb1f9b4166c532a4216179d83eb4bf21e6f95a9682f
Apr 21 00:03:00.121368 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:00.121346 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb7e593db137b9f51b2a7a346953aea0.slice/crio-b3330e075257ac9085af6340cc7b29efd31bc9e1a7b191444eeeb10f27fb18ff WatchSource:0}: Error finding container b3330e075257ac9085af6340cc7b29efd31bc9e1a7b191444eeeb10f27fb18ff: Status 404 returned error can't find the container with id b3330e075257ac9085af6340cc7b29efd31bc9e1a7b191444eeeb10f27fb18ff
Apr 21 00:03:00.123174 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.123150 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 00:03:00.167773 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.167740 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal"
Apr 21 00:03:00.181146 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.181120 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 00:03:00.181893 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.181880 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-115.ec2.internal"
Apr 21 00:03:00.187752 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.187735 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 00:03:00.216296 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.216261 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 00:03:00.846634 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.846410 2571 apiserver.go:52] "Watching apiserver"
Apr 21 00:03:00.854332 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.854297 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 00:03:00.854811 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.854783 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-6fz9j","openshift-network-diagnostics/network-check-target-s2mw9","openshift-network-operator/iptables-alerter-xkz4p","kube-system/konnectivity-agent-zsbmm","kube-system/kube-apiserver-proxy-ip-10-0-143-115.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd","openshift-cluster-node-tuning-operator/tuned-xxv48","openshift-image-registry/node-ca-9hvww","openshift-multus/multus-hvbch","openshift-ovn-kubernetes/ovnkube-node-2nrh8","openshift-dns/node-resolver-gwszj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal","openshift-multus/multus-additional-cni-plugins-mghn5"]
Apr 21 00:03:00.858057 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.857864 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.858057 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.857906 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:00.858057 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:00.857977 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c"
Apr 21 00:03:00.858988 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.858958 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xkz4p"
Apr 21 00:03:00.860448 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.860426 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 00:03:00.860630 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.860616 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-c7dbj\""
Apr 21 00:03:00.860724 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.860687 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 00:03:00.861575 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.861552 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zsbmm"
Apr 21 00:03:00.861860 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.861840 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 00:03:00.862519 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.862227 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 00:03:00.862519 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.862279 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 00:03:00.862519 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.862310 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-p6d6z\""
Apr 21 00:03:00.863732 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.862974 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd"
Apr 21 00:03:00.864676 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.864353 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 00:03:00.864676 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.864416 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9hvww"
Apr 21 00:03:00.864676 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.864591 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-sprp8\""
Apr 21 00:03:00.864878 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.864773 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 00:03:00.865291 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.865132 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 00:03:00.866771 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.865477 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zt6jn\""
Apr 21 00:03:00.866771 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.865705 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 00:03:00.866771 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.865929 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 00:03:00.866771 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.866031 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.866771 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.866497 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 00:03:00.867145 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.866840 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jsxbx\""
Apr 21 00:03:00.867145 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.866859 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 00:03:00.867256 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.867085 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 00:03:00.868165 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.868145 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-sgq7g\""
Apr 21 00:03:00.868254 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.868180 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 00:03:00.868582 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.868415 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 00:03:00.868582 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.868449 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 00:03:00.869151 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.869123 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 00:03:00.869858 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.869837 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.873506 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.872284 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 00:03:00.874393 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.874123 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 00:03:00.874393 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.874150 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tljr2\""
Apr 21 00:03:00.874393 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.874233 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:03:00.874393 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.874164 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 00:03:00.874393 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:00.874300 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212"
Apr 21 00:03:00.874393 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.874364 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 00:03:00.874735 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.874480 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 00:03:00.874735 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.874655 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 00:03:00.875685 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.875653 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mghn5"
Apr 21 00:03:00.875762 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.875659 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gwszj"
Apr 21 00:03:00.877604 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.877583 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/728512b6-8990-45f9-b0fa-89772f9c1362-konnectivity-ca\") pod \"konnectivity-agent-zsbmm\" (UID: \"728512b6-8990-45f9-b0fa-89772f9c1362\") " pod="kube-system/konnectivity-agent-zsbmm"
Apr 21 00:03:00.877705 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.877622 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gt5x\" (UniqueName: \"kubernetes.io/projected/dbf6c72e-bf6b-47f7-bac9-1b24d4a37975-kube-api-access-2gt5x\") pod \"node-ca-9hvww\" (UID: \"dbf6c72e-bf6b-47f7-bac9-1b24d4a37975\") " pod="openshift-image-registry/node-ca-9hvww"
Apr 21 00:03:00.877705 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.877681 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc5d7\" (UniqueName: \"kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7\") pod \"network-check-target-s2mw9\" (UID: \"a10f7678-f6da-46bb-86eb-c0de2afb421c\") " pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:00.877813 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.877715 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-sys-fs\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd"
Apr 21 00:03:00.877813 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.877740 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-cnibin\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.877813 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.877764 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-cni-netd\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.877813 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.877801 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.878003 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.877824 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/012b4bee-5b6f-4bec-9704-f110e7aba3eb-ovnkube-script-lib\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.878003 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.877847 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-sysconfig\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.878003 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.877868 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-run-netns\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.878003 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.877911 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-run-netns\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.878003 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.877943 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-var-lib-openvswitch\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.878003 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.877970 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-run\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.878003 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.877987 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878021 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbf6c72e-bf6b-47f7-bac9-1b24d4a37975-host\") pod \"node-ca-9hvww\" (UID: \"dbf6c72e-bf6b-47f7-bac9-1b24d4a37975\") " pod="openshift-image-registry/node-ca-9hvww"
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878048 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-etc-openvswitch\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878072 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-modprobe-d\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878113 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-kubernetes\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878126 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qfhfx\""
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878138 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b578ee7e-4063-4906-b849-ca0d856e3c15-multus-daemon-config\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878161 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-kubelet\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878185 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khbm2\" (UniqueName: \"kubernetes.io/projected/fe295176-34bd-4593-a32e-cd3d077a6b0f-kube-api-access-khbm2\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878209 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-run-ovn\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878247 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-host\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878271 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-registration-dir\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd"
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878297 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx6nn\" (UniqueName: \"kubernetes.io/projected/441c3a92-f5bd-446c-beb0-d70639bbb401-kube-api-access-nx6nn\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd"
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878309 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878320 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-os-release\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878309 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878348 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-run-multus-certs\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.878363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878371 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-slash\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878393 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/012b4bee-5b6f-4bec-9704-f110e7aba3eb-ovnkube-config\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878419 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/012b4bee-5b6f-4bec-9704-f110e7aba3eb-env-overrides\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878435 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dhzhp\""
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878443 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl867\" (UniqueName: \"kubernetes.io/projected/012b4bee-5b6f-4bec-9704-f110e7aba3eb-kube-api-access-hl867\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878467 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-run-systemd\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-node-log\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878512 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-run-ovn-kubernetes\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878535 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-sysctl-conf\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878558 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-var-lib-cni-bin\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878581 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs\") pod \"network-metrics-daemon-6fz9j\" (UID: \"173d74c8-1f07-4764-a03f-8091e02dc212\") " pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878603 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-systemd\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878624 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878626 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-var-lib-kubelet\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878653 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe295176-34bd-4593-a32e-cd3d077a6b0f-tmp\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878678 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/75287a4a-4503-4680-a428-655e61b86e85-host-slash\") pod \"iptables-alerter-xkz4p\" (UID: \"75287a4a-4503-4680-a428-655e61b86e85\") " pod="openshift-network-operator/iptables-alerter-xkz4p"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878716 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-var-lib-kubelet\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878761 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-multus-conf-dir\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.879125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878787 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-528gb\" (UniqueName: \"kubernetes.io/projected/b578ee7e-4063-4906-b849-ca0d856e3c15-kube-api-access-528gb\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878810 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-sys\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.879952 ip-10-0-143-115
kubenswrapper[2571]: I0421 00:03:00.878898 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878914 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-multus-socket-dir-parent\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878946 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/012b4bee-5b6f-4bec-9704-f110e7aba3eb-ovn-node-metrics-cert\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878969 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/75287a4a-4503-4680-a428-655e61b86e85-iptables-alerter-script\") pod \"iptables-alerter-xkz4p\" (UID: \"75287a4a-4503-4680-a428-655e61b86e85\") " pod="openshift-network-operator/iptables-alerter-xkz4p" Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.878992 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/728512b6-8990-45f9-b0fa-89772f9c1362-agent-certs\") pod 
\"konnectivity-agent-zsbmm\" (UID: \"728512b6-8990-45f9-b0fa-89772f9c1362\") " pod="kube-system/konnectivity-agent-zsbmm" Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879017 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-socket-dir\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879061 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dbf6c72e-bf6b-47f7-bac9-1b24d4a37975-serviceca\") pod \"node-ca-9hvww\" (UID: \"dbf6c72e-bf6b-47f7-bac9-1b24d4a37975\") " pod="openshift-image-registry/node-ca-9hvww" Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879125 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-multus-cni-dir\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879152 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b578ee7e-4063-4906-b849-ca0d856e3c15-cni-binary-copy\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879179 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-var-lib-cni-multus\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879245 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-etc-kubernetes\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879271 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wkjk\" (UniqueName: \"kubernetes.io/projected/75287a4a-4503-4680-a428-655e61b86e85-kube-api-access-9wkjk\") pod \"iptables-alerter-xkz4p\" (UID: \"75287a4a-4503-4680-a428-655e61b86e85\") " pod="openshift-network-operator/iptables-alerter-xkz4p" Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879297 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-etc-selinux\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879323 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gw86\" (UniqueName: \"kubernetes.io/projected/173d74c8-1f07-4764-a03f-8091e02dc212-kube-api-access-2gw86\") pod \"network-metrics-daemon-6fz9j\" (UID: \"173d74c8-1f07-4764-a03f-8091e02dc212\") " pod="openshift-multus/network-metrics-daemon-6fz9j" Apr 21 00:03:00.879952 ip-10-0-143-115 kubenswrapper[2571]: 
I0421 00:03:00.879347 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-systemd-units\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:00.880687 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879368 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-sysctl-d\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48" Apr 21 00:03:00.880687 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879391 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-hostroot\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.880687 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879417 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-run-openvswitch\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:00.880687 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879436 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-lib-modules\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " 
pod="openshift-cluster-node-tuning-operator/tuned-xxv48" Apr 21 00:03:00.880687 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879459 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-device-dir\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" Apr 21 00:03:00.880687 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879483 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-system-cni-dir\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.880687 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879509 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-run-k8s-cni-cncf-io\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.880687 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879556 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-log-socket\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:00.880687 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879580 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-cni-bin\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:00.880687 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.879604 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-tuned\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48" Apr 21 00:03:00.917615 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.917578 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 23:57:59 +0000 UTC" deadline="2027-09-17 05:35:23.266902451 +0000 UTC" Apr 21 00:03:00.917761 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.917628 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12341h32m22.349278696s" Apr 21 00:03:00.969268 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.969234 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 00:03:00.980513 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980467 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f398c142-f284-48f9-b608-6eb7229425ae-cni-binary-copy\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:00.980665 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-var-lib-cni-bin\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.980665 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980594 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs\") pod \"network-metrics-daemon-6fz9j\" (UID: \"173d74c8-1f07-4764-a03f-8091e02dc212\") " pod="openshift-multus/network-metrics-daemon-6fz9j" Apr 21 00:03:00.980665 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980621 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-systemd\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48" Apr 21 00:03:00.980665 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-var-lib-kubelet\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48" Apr 21 00:03:00.980875 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe295176-34bd-4593-a32e-cd3d077a6b0f-tmp\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48" Apr 21 00:03:00.980875 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980694 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/75287a4a-4503-4680-a428-655e61b86e85-host-slash\") pod \"iptables-alerter-xkz4p\" (UID: \"75287a4a-4503-4680-a428-655e61b86e85\") " pod="openshift-network-operator/iptables-alerter-xkz4p" Apr 21 00:03:00.980875 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980716 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-var-lib-kubelet\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.980875 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-multus-conf-dir\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.980875 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980763 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-528gb\" (UniqueName: \"kubernetes.io/projected/b578ee7e-4063-4906-b849-ca0d856e3c15-kube-api-access-528gb\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.980875 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980855 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-var-lib-kubelet\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48" Apr 21 00:03:00.980875 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980855 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-systemd\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48" Apr 21 00:03:00.981212 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980864 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-var-lib-cni-bin\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.981212 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980889 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-sys\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48" Apr 21 00:03:00.981212 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980937 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/75287a4a-4503-4680-a428-655e61b86e85-host-slash\") pod \"iptables-alerter-xkz4p\" (UID: \"75287a4a-4503-4680-a428-655e61b86e85\") " pod="openshift-network-operator/iptables-alerter-xkz4p" Apr 21 00:03:00.981212 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980955 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-sys\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48" Apr 21 00:03:00.981212 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980988 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-multus-conf-dir\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.981212 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.980988 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f398c142-f284-48f9-b608-6eb7229425ae-os-release\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:00.981212 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:00.981115 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 00:03:00.981212 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981157 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" Apr 21 00:03:00.981212 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:00.981204 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs podName:173d74c8-1f07-4764-a03f-8091e02dc212 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:01.48116217 +0000 UTC m=+3.094533670 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs") pod "network-metrics-daemon-6fz9j" (UID: "173d74c8-1f07-4764-a03f-8091e02dc212") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 00:03:00.981618 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981234 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" Apr 21 00:03:00.981618 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981294 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-var-lib-kubelet\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.981618 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981333 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-multus-socket-dir-parent\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.981618 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981392 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/012b4bee-5b6f-4bec-9704-f110e7aba3eb-ovn-node-metrics-cert\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:00.981618 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981382 
2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 00:03:00.981618 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981457 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-multus-socket-dir-parent\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.981618 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981489 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/75287a4a-4503-4680-a428-655e61b86e85-iptables-alerter-script\") pod \"iptables-alerter-xkz4p\" (UID: \"75287a4a-4503-4680-a428-655e61b86e85\") " pod="openshift-network-operator/iptables-alerter-xkz4p" Apr 21 00:03:00.981618 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981617 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/728512b6-8990-45f9-b0fa-89772f9c1362-agent-certs\") pod \"konnectivity-agent-zsbmm\" (UID: \"728512b6-8990-45f9-b0fa-89772f9c1362\") " pod="kube-system/konnectivity-agent-zsbmm" Apr 21 00:03:00.981981 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981655 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-socket-dir\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" Apr 21 00:03:00.981981 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981785 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-socket-dir\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" Apr 21 00:03:00.981981 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981832 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dbf6c72e-bf6b-47f7-bac9-1b24d4a37975-serviceca\") pod \"node-ca-9hvww\" (UID: \"dbf6c72e-bf6b-47f7-bac9-1b24d4a37975\") " pod="openshift-image-registry/node-ca-9hvww" Apr 21 00:03:00.981981 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981869 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-multus-cni-dir\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.981981 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981901 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b578ee7e-4063-4906-b849-ca0d856e3c15-cni-binary-copy\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.981981 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.981925 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-var-lib-cni-multus\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.982274 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-var-lib-cni-multus\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.982274 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982129 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-etc-kubernetes\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.982274 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wkjk\" (UniqueName: \"kubernetes.io/projected/75287a4a-4503-4680-a428-655e61b86e85-kube-api-access-9wkjk\") pod \"iptables-alerter-xkz4p\" (UID: \"75287a4a-4503-4680-a428-655e61b86e85\") " pod="openshift-network-operator/iptables-alerter-xkz4p"
Apr 21 00:03:00.982274 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982182 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-etc-selinux\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd"
Apr 21 00:03:00.982274 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982209 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gw86\" (UniqueName: \"kubernetes.io/projected/173d74c8-1f07-4764-a03f-8091e02dc212-kube-api-access-2gw86\") pod \"network-metrics-daemon-6fz9j\" (UID: \"173d74c8-1f07-4764-a03f-8091e02dc212\") " pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:03:00.982492 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982328 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-etc-selinux\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd"
Apr 21 00:03:00.982492 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982359 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-systemd-units\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.982492 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982384 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-sysctl-d\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.982492 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982408 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-hostroot\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.982492 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982434 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-run-openvswitch\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.982492 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982465 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-lib-modules\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.982492 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982484 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b578ee7e-4063-4906-b849-ca0d856e3c15-cni-binary-copy\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.982802 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982494 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c972b\" (UniqueName: \"kubernetes.io/projected/918f7c2d-8780-4291-b141-5fb77d94b6cf-kube-api-access-c972b\") pod \"node-resolver-gwszj\" (UID: \"918f7c2d-8780-4291-b141-5fb77d94b6cf\") " pod="openshift-dns/node-resolver-gwszj"
Apr 21 00:03:00.982802 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982522 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f398c142-f284-48f9-b608-6eb7229425ae-cnibin\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5"
Apr 21 00:03:00.982802 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-device-dir\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd"
Apr 21 00:03:00.982802 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982638 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-system-cni-dir\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.982802 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982663 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-run-k8s-cni-cncf-io\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.982802 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.982677 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-lib-modules\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.983644 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-run-k8s-cni-cncf-io\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.983690 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-system-cni-dir\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.983725 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-log-socket\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.983782 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-log-socket\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.983793 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-etc-kubernetes\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.983830 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-systemd-units\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.983875 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-run-openvswitch\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.983891 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-sysctl-d\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.983954 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-hostroot\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.983981 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-device-dir\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.984087 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/75287a4a-4503-4680-a428-655e61b86e85-iptables-alerter-script\") pod \"iptables-alerter-xkz4p\" (UID: \"75287a4a-4503-4680-a428-655e61b86e85\") " pod="openshift-network-operator/iptables-alerter-xkz4p"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.984522 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-cni-bin\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.984634 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-multus-cni-dir\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.984634 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dbf6c72e-bf6b-47f7-bac9-1b24d4a37975-serviceca\") pod \"node-ca-9hvww\" (UID: \"dbf6c72e-bf6b-47f7-bac9-1b24d4a37975\") " pod="openshift-image-registry/node-ca-9hvww"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.984914 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-cni-bin\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.984975 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-tuned\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.985066 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/728512b6-8990-45f9-b0fa-89772f9c1362-konnectivity-ca\") pod \"konnectivity-agent-zsbmm\" (UID: \"728512b6-8990-45f9-b0fa-89772f9c1362\") " pod="kube-system/konnectivity-agent-zsbmm"
Apr 21 00:03:00.988426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.985379 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/012b4bee-5b6f-4bec-9704-f110e7aba3eb-ovn-node-metrics-cert\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.985855 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/728512b6-8990-45f9-b0fa-89772f9c1362-konnectivity-ca\") pod \"konnectivity-agent-zsbmm\" (UID: \"728512b6-8990-45f9-b0fa-89772f9c1362\") " pod="kube-system/konnectivity-agent-zsbmm"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.985912 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gt5x\" (UniqueName: \"kubernetes.io/projected/dbf6c72e-bf6b-47f7-bac9-1b24d4a37975-kube-api-access-2gt5x\") pod \"node-ca-9hvww\" (UID: \"dbf6c72e-bf6b-47f7-bac9-1b24d4a37975\") " pod="openshift-image-registry/node-ca-9hvww"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.985944 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/728512b6-8990-45f9-b0fa-89772f9c1362-agent-certs\") pod \"konnectivity-agent-zsbmm\" (UID: \"728512b6-8990-45f9-b0fa-89772f9c1362\") " pod="kube-system/konnectivity-agent-zsbmm"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.985961 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/918f7c2d-8780-4291-b141-5fb77d94b6cf-tmp-dir\") pod \"node-resolver-gwszj\" (UID: \"918f7c2d-8780-4291-b141-5fb77d94b6cf\") " pod="openshift-dns/node-resolver-gwszj"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.986004 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f398c142-f284-48f9-b608-6eb7229425ae-system-cni-dir\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.986210 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f398c142-f284-48f9-b608-6eb7229425ae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.986360 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc5d7\" (UniqueName: \"kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7\") pod \"network-check-target-s2mw9\" (UID: \"a10f7678-f6da-46bb-86eb-c0de2afb421c\") " pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.986395 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-sys-fs\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.986592 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-cnibin\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.986644 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-cni-netd\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.986726 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-cnibin\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.986732 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.986772 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-sys-fs\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.986849 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.986965 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-cni-netd\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987021 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/012b4bee-5b6f-4bec-9704-f110e7aba3eb-ovnkube-script-lib\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.989331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987139 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-sysconfig\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987172 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-run-netns\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987239 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-run-netns\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987270 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-var-lib-openvswitch\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-run\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987337 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/918f7c2d-8780-4291-b141-5fb77d94b6cf-hosts-file\") pod \"node-resolver-gwszj\" (UID: \"918f7c2d-8780-4291-b141-5fb77d94b6cf\") " pod="openshift-dns/node-resolver-gwszj"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987461 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-tuned\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987633 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f398c142-f284-48f9-b608-6eb7229425ae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987656 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-run\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbf6c72e-bf6b-47f7-bac9-1b24d4a37975-host\") pod \"node-ca-9hvww\" (UID: \"dbf6c72e-bf6b-47f7-bac9-1b24d4a37975\") " pod="openshift-image-registry/node-ca-9hvww"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987706 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-sysconfig\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987719 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-etc-openvswitch\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987759 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-modprobe-d\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987776 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-run-netns\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987789 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-kubernetes\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987851 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b578ee7e-4063-4906-b849-ca0d856e3c15-multus-daemon-config\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987857 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-kubernetes\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987886 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-kubelet\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987946 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khbm2\" (UniqueName: \"kubernetes.io/projected/fe295176-34bd-4593-a32e-cd3d077a6b0f-kube-api-access-khbm2\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.987982 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-run-ovn\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.988010 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-host\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.988049 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f398c142-f284-48f9-b608-6eb7229425ae-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.988084 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-registration-dir\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.988357 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nx6nn\" (UniqueName: \"kubernetes.io/projected/441c3a92-f5bd-446c-beb0-d70639bbb401-kube-api-access-nx6nn\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.988391 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-os-release\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.988417 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/012b4bee-5b6f-4bec-9704-f110e7aba3eb-ovnkube-script-lib\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.988427 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-run-multus-certs\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.988486 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-run-netns\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.988521 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-slash\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.988560 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/012b4bee-5b6f-4bec-9704-f110e7aba3eb-ovnkube-config\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.988580 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-host-run-multus-certs\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.988595 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/012b4bee-5b6f-4bec-9704-f110e7aba3eb-env-overrides\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.988631 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hl867\" (UniqueName: \"kubernetes.io/projected/012b4bee-5b6f-4bec-9704-f110e7aba3eb-kube-api-access-hl867\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.988659 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/441c3a92-f5bd-446c-beb0-d70639bbb401-registration-dir\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.989029 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-kubelet\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.990888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.989048 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b578ee7e-4063-4906-b849-ca0d856e3c15-os-release\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.989107 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-run-systemd\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.989150 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p4j9\" (UniqueName: \"kubernetes.io/projected/f398c142-f284-48f9-b608-6eb7229425ae-kube-api-access-7p4j9\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5"
Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.989225 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-modprobe-d\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.989303 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-node-log\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.989345 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-run-ovn-kubernetes\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.989390 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-slash\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.989432 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-sysctl-conf\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.989472 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-host-run-ovn-kubernetes\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8"
Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.989534 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b578ee7e-4063-4906-b849-ca0d856e3c15-multus-daemon-config\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch"
Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.989723 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-etc-sysctl-conf\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48"
Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.990035 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbf6c72e-bf6b-47f7-bac9-1b24d4a37975-host\") pod \"node-ca-9hvww\" (UID: \"dbf6c72e-bf6b-47f7-bac9-1b24d4a37975\") " pod="openshift-image-registry/node-ca-9hvww"
Apr 21 00:03:00.991684 ip-10-0-143-115
kubenswrapper[2571]: I0421 00:03:00.990054 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-run-systemd\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.990082 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-run-ovn\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.990120 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-var-lib-openvswitch\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.990166 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe295176-34bd-4593-a32e-cd3d077a6b0f-host\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48" Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.990169 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-etc-openvswitch\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.990669 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/012b4bee-5b6f-4bec-9704-f110e7aba3eb-env-overrides\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:00.991684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.990694 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/012b4bee-5b6f-4bec-9704-f110e7aba3eb-node-log\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:00.992519 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.991188 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/012b4bee-5b6f-4bec-9704-f110e7aba3eb-ovnkube-config\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:00.992519 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.992370 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe295176-34bd-4593-a32e-cd3d077a6b0f-tmp\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48" Apr 21 00:03:00.993611 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:00.993571 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 00:03:00.993611 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:00.993606 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 00:03:00.993788 
ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:00.993620 2571 projected.go:194] Error preparing data for projected volume kube-api-access-tc5d7 for pod openshift-network-diagnostics/network-check-target-s2mw9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 00:03:00.993788 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:00.993704 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7 podName:a10f7678-f6da-46bb-86eb-c0de2afb421c nodeName:}" failed. No retries permitted until 2026-04-21 00:03:01.493681772 +0000 UTC m=+3.107053288 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tc5d7" (UniqueName: "kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7") pod "network-check-target-s2mw9" (UID: "a10f7678-f6da-46bb-86eb-c0de2afb421c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 00:03:00.994464 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.994386 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal" event={"ID":"eb7e593db137b9f51b2a7a346953aea0","Type":"ContainerStarted","Data":"b3330e075257ac9085af6340cc7b29efd31bc9e1a7b191444eeeb10f27fb18ff"} Apr 21 00:03:00.996920 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.996893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wkjk\" (UniqueName: \"kubernetes.io/projected/75287a4a-4503-4680-a428-655e61b86e85-kube-api-access-9wkjk\") pod \"iptables-alerter-xkz4p\" (UID: \"75287a4a-4503-4680-a428-655e61b86e85\") " pod="openshift-network-operator/iptables-alerter-xkz4p" Apr 21 00:03:00.997335 
ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.997312 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gw86\" (UniqueName: \"kubernetes.io/projected/173d74c8-1f07-4764-a03f-8091e02dc212-kube-api-access-2gw86\") pod \"network-metrics-daemon-6fz9j\" (UID: \"173d74c8-1f07-4764-a03f-8091e02dc212\") " pod="openshift-multus/network-metrics-daemon-6fz9j" Apr 21 00:03:00.997335 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.997329 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-528gb\" (UniqueName: \"kubernetes.io/projected/b578ee7e-4063-4906-b849-ca0d856e3c15-kube-api-access-528gb\") pod \"multus-hvbch\" (UID: \"b578ee7e-4063-4906-b849-ca0d856e3c15\") " pod="openshift-multus/multus-hvbch" Apr 21 00:03:00.997567 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.997540 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-115.ec2.internal" event={"ID":"b1cf2d95e85fe20270acb11fb3142e37","Type":"ContainerStarted","Data":"cc8cd87088d32fa05598eeb1f9b4166c532a4216179d83eb4bf21e6f95a9682f"} Apr 21 00:03:00.998511 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:00.998456 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gt5x\" (UniqueName: \"kubernetes.io/projected/dbf6c72e-bf6b-47f7-bac9-1b24d4a37975-kube-api-access-2gt5x\") pod \"node-ca-9hvww\" (UID: \"dbf6c72e-bf6b-47f7-bac9-1b24d4a37975\") " pod="openshift-image-registry/node-ca-9hvww" Apr 21 00:03:01.000510 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.000470 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl867\" (UniqueName: \"kubernetes.io/projected/012b4bee-5b6f-4bec-9704-f110e7aba3eb-kube-api-access-hl867\") pod \"ovnkube-node-2nrh8\" (UID: \"012b4bee-5b6f-4bec-9704-f110e7aba3eb\") " pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:01.001056 ip-10-0-143-115 
kubenswrapper[2571]: I0421 00:03:01.001033 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khbm2\" (UniqueName: \"kubernetes.io/projected/fe295176-34bd-4593-a32e-cd3d077a6b0f-kube-api-access-khbm2\") pod \"tuned-xxv48\" (UID: \"fe295176-34bd-4593-a32e-cd3d077a6b0f\") " pod="openshift-cluster-node-tuning-operator/tuned-xxv48" Apr 21 00:03:01.002139 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.002115 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx6nn\" (UniqueName: \"kubernetes.io/projected/441c3a92-f5bd-446c-beb0-d70639bbb401-kube-api-access-nx6nn\") pod \"aws-ebs-csi-driver-node-lcdmd\" (UID: \"441c3a92-f5bd-446c-beb0-d70639bbb401\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" Apr 21 00:03:01.026984 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.026941 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 00:03:01.090382 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090346 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f398c142-f284-48f9-b608-6eb7229425ae-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090390 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7p4j9\" (UniqueName: \"kubernetes.io/projected/f398c142-f284-48f9-b608-6eb7229425ae-kube-api-access-7p4j9\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090415 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f398c142-f284-48f9-b608-6eb7229425ae-cni-binary-copy\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090457 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f398c142-f284-48f9-b608-6eb7229425ae-os-release\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090493 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c972b\" (UniqueName: \"kubernetes.io/projected/918f7c2d-8780-4291-b141-5fb77d94b6cf-kube-api-access-c972b\") pod \"node-resolver-gwszj\" (UID: \"918f7c2d-8780-4291-b141-5fb77d94b6cf\") " pod="openshift-dns/node-resolver-gwszj" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090516 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f398c142-f284-48f9-b608-6eb7229425ae-cnibin\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090545 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/918f7c2d-8780-4291-b141-5fb77d94b6cf-tmp-dir\") pod \"node-resolver-gwszj\" (UID: \"918f7c2d-8780-4291-b141-5fb77d94b6cf\") " pod="openshift-dns/node-resolver-gwszj" Apr 21 00:03:01.090977 
ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090571 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f398c142-f284-48f9-b608-6eb7229425ae-system-cni-dir\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090594 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f398c142-f284-48f9-b608-6eb7229425ae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090605 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f398c142-f284-48f9-b608-6eb7229425ae-os-release\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090653 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/918f7c2d-8780-4291-b141-5fb77d94b6cf-hosts-file\") pod \"node-resolver-gwszj\" (UID: \"918f7c2d-8780-4291-b141-5fb77d94b6cf\") " pod="openshift-dns/node-resolver-gwszj" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090693 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f398c142-f284-48f9-b608-6eb7229425ae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mghn5\" (UID: 
\"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090740 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f398c142-f284-48f9-b608-6eb7229425ae-cnibin\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090812 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f398c142-f284-48f9-b608-6eb7229425ae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090857 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/918f7c2d-8780-4291-b141-5fb77d94b6cf-hosts-file\") pod \"node-resolver-gwszj\" (UID: \"918f7c2d-8780-4291-b141-5fb77d94b6cf\") " pod="openshift-dns/node-resolver-gwszj" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090865 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f398c142-f284-48f9-b608-6eb7229425ae-system-cni-dir\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.090977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.090924 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/918f7c2d-8780-4291-b141-5fb77d94b6cf-tmp-dir\") pod 
\"node-resolver-gwszj\" (UID: \"918f7c2d-8780-4291-b141-5fb77d94b6cf\") " pod="openshift-dns/node-resolver-gwszj" Apr 21 00:03:01.091614 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.091028 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f398c142-f284-48f9-b608-6eb7229425ae-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.091614 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.091352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f398c142-f284-48f9-b608-6eb7229425ae-cni-binary-copy\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.091614 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.091352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f398c142-f284-48f9-b608-6eb7229425ae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.099338 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.099253 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c972b\" (UniqueName: \"kubernetes.io/projected/918f7c2d-8780-4291-b141-5fb77d94b6cf-kube-api-access-c972b\") pod \"node-resolver-gwszj\" (UID: \"918f7c2d-8780-4291-b141-5fb77d94b6cf\") " pod="openshift-dns/node-resolver-gwszj" Apr 21 00:03:01.099469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.099378 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7p4j9\" (UniqueName: \"kubernetes.io/projected/f398c142-f284-48f9-b608-6eb7229425ae-kube-api-access-7p4j9\") pod \"multus-additional-cni-plugins-mghn5\" (UID: \"f398c142-f284-48f9-b608-6eb7229425ae\") " pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.170499 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.170463 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xxv48" Apr 21 00:03:01.180306 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.180278 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xkz4p" Apr 21 00:03:01.188012 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.187990 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zsbmm" Apr 21 00:03:01.194854 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.194820 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" Apr 21 00:03:01.201575 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.201552 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9hvww" Apr 21 00:03:01.210303 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.210253 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hvbch" Apr 21 00:03:01.217368 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.217345 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:01.226001 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.225979 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mghn5" Apr 21 00:03:01.232584 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.232568 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gwszj" Apr 21 00:03:01.493799 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.493695 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs\") pod \"network-metrics-daemon-6fz9j\" (UID: \"173d74c8-1f07-4764-a03f-8091e02dc212\") " pod="openshift-multus/network-metrics-daemon-6fz9j" Apr 21 00:03:01.493799 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.493766 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc5d7\" (UniqueName: \"kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7\") pod \"network-check-target-s2mw9\" (UID: \"a10f7678-f6da-46bb-86eb-c0de2afb421c\") " pod="openshift-network-diagnostics/network-check-target-s2mw9" Apr 21 00:03:01.494024 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:01.493852 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 00:03:01.494024 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:01.493909 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 00:03:01.494024 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:01.493930 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 00:03:01.494024 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:01.493944 2571 projected.go:194] Error preparing data 
for projected volume kube-api-access-tc5d7 for pod openshift-network-diagnostics/network-check-target-s2mw9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 00:03:01.494024 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:01.493930 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs podName:173d74c8-1f07-4764-a03f-8091e02dc212 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:02.493911159 +0000 UTC m=+4.107282672 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs") pod "network-metrics-daemon-6fz9j" (UID: "173d74c8-1f07-4764-a03f-8091e02dc212") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 00:03:01.494024 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:01.494003 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7 podName:a10f7678-f6da-46bb-86eb-c0de2afb421c nodeName:}" failed. No retries permitted until 2026-04-21 00:03:02.493987833 +0000 UTC m=+4.107359332 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tc5d7" (UniqueName: "kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7") pod "network-check-target-s2mw9" (UID: "a10f7678-f6da-46bb-86eb-c0de2afb421c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 00:03:01.855657 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:01.855474 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf398c142_f284_48f9_b608_6eb7229425ae.slice/crio-bc6be708892610860db8a5dafdbda3f0808236c05a711c205dd038e495e1ebff WatchSource:0}: Error finding container bc6be708892610860db8a5dafdbda3f0808236c05a711c205dd038e495e1ebff: Status 404 returned error can't find the container with id bc6be708892610860db8a5dafdbda3f0808236c05a711c205dd038e495e1ebff Apr 21 00:03:01.857172 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:01.857146 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod918f7c2d_8780_4291_b141_5fb77d94b6cf.slice/crio-a8ed6738b31df00fb0f9dbcc3539b6e48d67ec7f3ed2d0fbf599dcbb5bfade5d WatchSource:0}: Error finding container a8ed6738b31df00fb0f9dbcc3539b6e48d67ec7f3ed2d0fbf599dcbb5bfade5d: Status 404 returned error can't find the container with id a8ed6738b31df00fb0f9dbcc3539b6e48d67ec7f3ed2d0fbf599dcbb5bfade5d Apr 21 00:03:01.860786 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:01.860767 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod728512b6_8990_45f9_b0fa_89772f9c1362.slice/crio-48d45412af634f80e236ec62e0982730b7a180c720de8daddd911b819d9e3614 WatchSource:0}: Error finding container 48d45412af634f80e236ec62e0982730b7a180c720de8daddd911b819d9e3614: Status 404 returned error can't find the 
container with id 48d45412af634f80e236ec62e0982730b7a180c720de8daddd911b819d9e3614 Apr 21 00:03:01.862142 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:01.862120 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod012b4bee_5b6f_4bec_9704_f110e7aba3eb.slice/crio-e12e0e97495aa9bb7004258eda7e2f74ce0b3ce29daea4af102e3f32d2bcf15d WatchSource:0}: Error finding container e12e0e97495aa9bb7004258eda7e2f74ce0b3ce29daea4af102e3f32d2bcf15d: Status 404 returned error can't find the container with id e12e0e97495aa9bb7004258eda7e2f74ce0b3ce29daea4af102e3f32d2bcf15d Apr 21 00:03:01.863577 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:01.863552 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod441c3a92_f5bd_446c_beb0_d70639bbb401.slice/crio-5c70d9151b51d79ae73580d67b231c2d9672f23ed8b9ec6ffb4f740eba19489e WatchSource:0}: Error finding container 5c70d9151b51d79ae73580d67b231c2d9672f23ed8b9ec6ffb4f740eba19489e: Status 404 returned error can't find the container with id 5c70d9151b51d79ae73580d67b231c2d9672f23ed8b9ec6ffb4f740eba19489e Apr 21 00:03:01.863825 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:01.863801 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbf6c72e_bf6b_47f7_bac9_1b24d4a37975.slice/crio-1201ca513f2851475f500ca096a81d656fd48676fab96304b7b4904fe3347140 WatchSource:0}: Error finding container 1201ca513f2851475f500ca096a81d656fd48676fab96304b7b4904fe3347140: Status 404 returned error can't find the container with id 1201ca513f2851475f500ca096a81d656fd48676fab96304b7b4904fe3347140 Apr 21 00:03:01.865176 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:01.865154 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe295176_34bd_4593_a32e_cd3d077a6b0f.slice/crio-8670ae51701c72e00f3029bd529372a345c9c87f37d3d1baf939638aa17e1e96 WatchSource:0}: Error finding container 8670ae51701c72e00f3029bd529372a345c9c87f37d3d1baf939638aa17e1e96: Status 404 returned error can't find the container with id 8670ae51701c72e00f3029bd529372a345c9c87f37d3d1baf939638aa17e1e96
Apr 21 00:03:01.865973 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:01.865942 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb578ee7e_4063_4906_b849_ca0d856e3c15.slice/crio-10b3aa3f2dc15c69724a1440e912e9e5a6ef8fcb70737f3d6247333ac93ec381 WatchSource:0}: Error finding container 10b3aa3f2dc15c69724a1440e912e9e5a6ef8fcb70737f3d6247333ac93ec381: Status 404 returned error can't find the container with id 10b3aa3f2dc15c69724a1440e912e9e5a6ef8fcb70737f3d6247333ac93ec381
Apr 21 00:03:01.866993 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:01.866729 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75287a4a_4503_4680_a428_655e61b86e85.slice/crio-7aa0f1f6ef552f2fcb8c84c59c3828b684ca231d1b002c4f8df431d8af59006b WatchSource:0}: Error finding container 7aa0f1f6ef552f2fcb8c84c59c3828b684ca231d1b002c4f8df431d8af59006b: Status 404 returned error can't find the container with id 7aa0f1f6ef552f2fcb8c84c59c3828b684ca231d1b002c4f8df431d8af59006b
Apr 21 00:03:01.917777 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.917748 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 23:57:59 +0000 UTC" deadline="2027-09-26 16:23:24.99900001 +0000 UTC"
Apr 21 00:03:01.917777 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.917775 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12568h20m23.081228024s"
Apr 21 00:03:02.000004 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:01.999975 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9hvww" event={"ID":"dbf6c72e-bf6b-47f7-bac9-1b24d4a37975","Type":"ContainerStarted","Data":"1201ca513f2851475f500ca096a81d656fd48676fab96304b7b4904fe3347140"}
Apr 21 00:03:02.000873 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.000842 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gwszj" event={"ID":"918f7c2d-8780-4291-b141-5fb77d94b6cf","Type":"ContainerStarted","Data":"a8ed6738b31df00fb0f9dbcc3539b6e48d67ec7f3ed2d0fbf599dcbb5bfade5d"}
Apr 21 00:03:02.002310 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.002287 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-115.ec2.internal" event={"ID":"b1cf2d95e85fe20270acb11fb3142e37","Type":"ContainerStarted","Data":"b481a2f766f67bd9752dca887c2b0f00ba9037cde3d5618b38d3ae8a3e6d51e3"}
Apr 21 00:03:02.003299 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.003277 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xkz4p" event={"ID":"75287a4a-4503-4680-a428-655e61b86e85","Type":"ContainerStarted","Data":"7aa0f1f6ef552f2fcb8c84c59c3828b684ca231d1b002c4f8df431d8af59006b"}
Apr 21 00:03:02.004204 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.004185 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hvbch" event={"ID":"b578ee7e-4063-4906-b849-ca0d856e3c15","Type":"ContainerStarted","Data":"10b3aa3f2dc15c69724a1440e912e9e5a6ef8fcb70737f3d6247333ac93ec381"}
Apr 21 00:03:02.006116 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.005767 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xxv48" event={"ID":"fe295176-34bd-4593-a32e-cd3d077a6b0f","Type":"ContainerStarted","Data":"8670ae51701c72e00f3029bd529372a345c9c87f37d3d1baf939638aa17e1e96"}
Apr 21 00:03:02.007220 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.007192 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" event={"ID":"441c3a92-f5bd-446c-beb0-d70639bbb401","Type":"ContainerStarted","Data":"5c70d9151b51d79ae73580d67b231c2d9672f23ed8b9ec6ffb4f740eba19489e"}
Apr 21 00:03:02.008139 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.008118 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" event={"ID":"012b4bee-5b6f-4bec-9704-f110e7aba3eb","Type":"ContainerStarted","Data":"e12e0e97495aa9bb7004258eda7e2f74ce0b3ce29daea4af102e3f32d2bcf15d"}
Apr 21 00:03:02.009044 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.009022 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zsbmm" event={"ID":"728512b6-8990-45f9-b0fa-89772f9c1362","Type":"ContainerStarted","Data":"48d45412af634f80e236ec62e0982730b7a180c720de8daddd911b819d9e3614"}
Apr 21 00:03:02.010368 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.010347 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mghn5" event={"ID":"f398c142-f284-48f9-b608-6eb7229425ae","Type":"ContainerStarted","Data":"bc6be708892610860db8a5dafdbda3f0808236c05a711c205dd038e495e1ebff"}
Apr 21 00:03:02.015858 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.015823 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-115.ec2.internal" podStartSLOduration=2.015813885 podStartE2EDuration="2.015813885s" podCreationTimestamp="2026-04-21 00:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 00:03:02.015665342 +0000 UTC m=+3.629036865" watchObservedRunningTime="2026-04-21 00:03:02.015813885 +0000 UTC m=+3.629185406"
Apr 21 00:03:02.502155 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.502113 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc5d7\" (UniqueName: \"kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7\") pod \"network-check-target-s2mw9\" (UID: \"a10f7678-f6da-46bb-86eb-c0de2afb421c\") " pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:02.502557 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.502198 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs\") pod \"network-metrics-daemon-6fz9j\" (UID: \"173d74c8-1f07-4764-a03f-8091e02dc212\") " pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:03:02.502557 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:02.502347 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 00:03:02.502557 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:02.502408 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs podName:173d74c8-1f07-4764-a03f-8091e02dc212 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:04.502390211 +0000 UTC m=+6.115761724 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs") pod "network-metrics-daemon-6fz9j" (UID: "173d74c8-1f07-4764-a03f-8091e02dc212") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 00:03:02.502872 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:02.502853 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 00:03:02.502923 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:02.502879 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 00:03:02.502923 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:02.502892 2571 projected.go:194] Error preparing data for projected volume kube-api-access-tc5d7 for pod openshift-network-diagnostics/network-check-target-s2mw9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 00:03:02.503013 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:02.502940 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7 podName:a10f7678-f6da-46bb-86eb-c0de2afb421c nodeName:}" failed. No retries permitted until 2026-04-21 00:03:04.50292457 +0000 UTC m=+6.116296089 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-tc5d7" (UniqueName: "kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7") pod "network-check-target-s2mw9" (UID: "a10f7678-f6da-46bb-86eb-c0de2afb421c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 00:03:02.916872 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.916837 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 00:03:02.987705 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.987627 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:03:02.987869 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:02.987772 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212"
Apr 21 00:03:02.988232 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:02.988211 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:02.988328 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:02.988310 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c"
Apr 21 00:03:03.024129 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:03.023427 2571 generic.go:358] "Generic (PLEG): container finished" podID="eb7e593db137b9f51b2a7a346953aea0" containerID="325e6fa6c1cdc2ee3eb1ef7dc2572e2d2a88281b875a0eef5821f11ae8910d8a" exitCode=0
Apr 21 00:03:03.024129 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:03.023603 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal" event={"ID":"eb7e593db137b9f51b2a7a346953aea0","Type":"ContainerDied","Data":"325e6fa6c1cdc2ee3eb1ef7dc2572e2d2a88281b875a0eef5821f11ae8910d8a"}
Apr 21 00:03:04.046381 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:04.046337 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal" event={"ID":"eb7e593db137b9f51b2a7a346953aea0","Type":"ContainerStarted","Data":"24eb279e36ffb026a0e20f342f1aa2cb2a988c4cd8f63e4e30be5fd3a38f646f"}
Apr 21 00:03:04.061848 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:04.061768 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-115.ec2.internal" podStartSLOduration=4.061747841 podStartE2EDuration="4.061747841s" podCreationTimestamp="2026-04-21 00:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 00:03:04.060645521 +0000 UTC m=+5.674017049" watchObservedRunningTime="2026-04-21 00:03:04.061747841 +0000 UTC m=+5.675119363"
Apr 21 00:03:04.528303 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:04.528265 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc5d7\" (UniqueName: \"kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7\") pod \"network-check-target-s2mw9\" (UID: \"a10f7678-f6da-46bb-86eb-c0de2afb421c\") " pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:04.528492 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:04.528341 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs\") pod \"network-metrics-daemon-6fz9j\" (UID: \"173d74c8-1f07-4764-a03f-8091e02dc212\") " pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:03:04.528492 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:04.528471 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 00:03:04.528600 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:04.528533 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs podName:173d74c8-1f07-4764-a03f-8091e02dc212 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:08.528514309 +0000 UTC m=+10.141885815 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs") pod "network-metrics-daemon-6fz9j" (UID: "173d74c8-1f07-4764-a03f-8091e02dc212") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 00:03:04.528961 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:04.528941 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 00:03:04.529042 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:04.528965 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 00:03:04.529042 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:04.528990 2571 projected.go:194] Error preparing data for projected volume kube-api-access-tc5d7 for pod openshift-network-diagnostics/network-check-target-s2mw9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 00:03:04.529042 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:04.529039 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7 podName:a10f7678-f6da-46bb-86eb-c0de2afb421c nodeName:}" failed. No retries permitted until 2026-04-21 00:03:08.52902283 +0000 UTC m=+10.142394330 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-tc5d7" (UniqueName: "kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7") pod "network-check-target-s2mw9" (UID: "a10f7678-f6da-46bb-86eb-c0de2afb421c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 00:03:04.986070 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:04.985991 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:04.986227 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:04.986112 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c"
Apr 21 00:03:04.986227 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:04.986135 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:03:04.986356 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:04.986256 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212"
Apr 21 00:03:06.893322 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:06.893286 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8n24l"]
Apr 21 00:03:06.895232 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:06.895184 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:06.895365 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:06.895261 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8n24l" podUID="d4c3f54e-8135-4a92-b7dc-1bef279e0201"
Apr 21 00:03:06.947544 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:06.947428 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d4c3f54e-8135-4a92-b7dc-1bef279e0201-dbus\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:06.947544 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:06.947519 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:06.947763 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:06.947591 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d4c3f54e-8135-4a92-b7dc-1bef279e0201-kubelet-config\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:06.986153 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:06.985778 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:06.986153 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:06.985888 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c"
Apr 21 00:03:06.986363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:06.986259 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:03:06.986418 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:06.986377 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212"
Apr 21 00:03:07.049004 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:07.048360 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:07.049004 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:07.048408 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d4c3f54e-8135-4a92-b7dc-1bef279e0201-kubelet-config\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:07.049004 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:07.048468 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d4c3f54e-8135-4a92-b7dc-1bef279e0201-dbus\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:07.049004 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:07.048645 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d4c3f54e-8135-4a92-b7dc-1bef279e0201-dbus\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:07.049004 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:07.048750 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 00:03:07.049004 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:07.048820 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret podName:d4c3f54e-8135-4a92-b7dc-1bef279e0201 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:07.548800561 +0000 UTC m=+9.162172075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret") pod "global-pull-secret-syncer-8n24l" (UID: "d4c3f54e-8135-4a92-b7dc-1bef279e0201") : object "kube-system"/"original-pull-secret" not registered
Apr 21 00:03:07.049004 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:07.048847 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d4c3f54e-8135-4a92-b7dc-1bef279e0201-kubelet-config\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:07.553923 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:07.553875 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:07.554203 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:07.554045 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 00:03:07.554203 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:07.554142 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret podName:d4c3f54e-8135-4a92-b7dc-1bef279e0201 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:08.554121744 +0000 UTC m=+10.167493249 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret") pod "global-pull-secret-syncer-8n24l" (UID: "d4c3f54e-8135-4a92-b7dc-1bef279e0201") : object "kube-system"/"original-pull-secret" not registered
Apr 21 00:03:08.562057 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:08.562012 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:08.562594 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:08.562074 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc5d7\" (UniqueName: \"kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7\") pod \"network-check-target-s2mw9\" (UID: \"a10f7678-f6da-46bb-86eb-c0de2afb421c\") " pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:08.562594 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:08.562139 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs\") pod \"network-metrics-daemon-6fz9j\" (UID: \"173d74c8-1f07-4764-a03f-8091e02dc212\") " pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:03:08.562594 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:08.562268 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 00:03:08.562594 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:08.562335 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs podName:173d74c8-1f07-4764-a03f-8091e02dc212 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:16.562315885 +0000 UTC m=+18.175687386 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs") pod "network-metrics-daemon-6fz9j" (UID: "173d74c8-1f07-4764-a03f-8091e02dc212") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 00:03:08.562594 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:08.562331 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 00:03:08.562594 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:08.562399 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret podName:d4c3f54e-8135-4a92-b7dc-1bef279e0201 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:10.562380814 +0000 UTC m=+12.175752330 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret") pod "global-pull-secret-syncer-8n24l" (UID: "d4c3f54e-8135-4a92-b7dc-1bef279e0201") : object "kube-system"/"original-pull-secret" not registered
Apr 21 00:03:08.562594 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:08.562416 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 00:03:08.562594 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:08.562430 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 00:03:08.562594 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:08.562443 2571 projected.go:194] Error preparing data for projected volume kube-api-access-tc5d7 for pod openshift-network-diagnostics/network-check-target-s2mw9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 00:03:08.562594 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:08.562481 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7 podName:a10f7678-f6da-46bb-86eb-c0de2afb421c nodeName:}" failed. No retries permitted until 2026-04-21 00:03:16.562468412 +0000 UTC m=+18.175839918 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-tc5d7" (UniqueName: "kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7") pod "network-check-target-s2mw9" (UID: "a10f7678-f6da-46bb-86eb-c0de2afb421c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 00:03:08.986990 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:08.986541 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:08.986990 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:08.986660 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8n24l" podUID="d4c3f54e-8135-4a92-b7dc-1bef279e0201"
Apr 21 00:03:08.986990 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:08.986772 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:03:08.986990 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:08.986890 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:08.986990 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:08.986936 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212"
Apr 21 00:03:08.987435 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:08.987023 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c"
Apr 21 00:03:10.577362 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:10.577321 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:10.577822 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:10.577487 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 00:03:10.577822 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:10.577572 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret podName:d4c3f54e-8135-4a92-b7dc-1bef279e0201 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:14.577540044 +0000 UTC m=+16.190911570 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret") pod "global-pull-secret-syncer-8n24l" (UID: "d4c3f54e-8135-4a92-b7dc-1bef279e0201") : object "kube-system"/"original-pull-secret" not registered
Apr 21 00:03:10.985757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:10.985719 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:03:10.985909 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:10.985714 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:10.985909 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:10.985850 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212"
Apr 21 00:03:10.986023 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:10.985909 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8n24l" podUID="d4c3f54e-8135-4a92-b7dc-1bef279e0201"
Apr 21 00:03:10.986023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:10.985922 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:10.986023 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:10.986007 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c"
Apr 21 00:03:12.987894 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:12.987867 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:12.988389 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:12.987867 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:12.988389 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:12.987978 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8n24l" podUID="d4c3f54e-8135-4a92-b7dc-1bef279e0201"
Apr 21 00:03:12.988389 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:12.987864 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:03:12.988389 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:12.988043 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c"
Apr 21 00:03:12.988389 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:12.988159 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212"
Apr 21 00:03:14.602781 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:14.602748 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:14.603176 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:14.602885 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 00:03:14.603176 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:14.602942 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret podName:d4c3f54e-8135-4a92-b7dc-1bef279e0201 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:22.602927673 +0000 UTC m=+24.216299178 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret") pod "global-pull-secret-syncer-8n24l" (UID: "d4c3f54e-8135-4a92-b7dc-1bef279e0201") : object "kube-system"/"original-pull-secret" not registered
Apr 21 00:03:14.988328 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:14.988247 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:03:14.988473 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:14.988248 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:14.988473 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:14.988375 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212"
Apr 21 00:03:14.988473 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:14.988248 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:14.988623 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:14.988452 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-8n24l" podUID="d4c3f54e-8135-4a92-b7dc-1bef279e0201" Apr 21 00:03:14.988623 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:14.988543 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c" Apr 21 00:03:16.617282 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:16.617238 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc5d7\" (UniqueName: \"kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7\") pod \"network-check-target-s2mw9\" (UID: \"a10f7678-f6da-46bb-86eb-c0de2afb421c\") " pod="openshift-network-diagnostics/network-check-target-s2mw9" Apr 21 00:03:16.617704 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:16.617318 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs\") pod \"network-metrics-daemon-6fz9j\" (UID: \"173d74c8-1f07-4764-a03f-8091e02dc212\") " pod="openshift-multus/network-metrics-daemon-6fz9j" Apr 21 00:03:16.617704 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:16.617410 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 00:03:16.617704 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:16.617413 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 00:03:16.617704 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:16.617434 2571 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 00:03:16.617704 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:16.617445 2571 projected.go:194] Error preparing data for projected volume kube-api-access-tc5d7 for pod openshift-network-diagnostics/network-check-target-s2mw9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 00:03:16.617704 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:16.617465 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs podName:173d74c8-1f07-4764-a03f-8091e02dc212 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:32.617450339 +0000 UTC m=+34.230821839 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs") pod "network-metrics-daemon-6fz9j" (UID: "173d74c8-1f07-4764-a03f-8091e02dc212") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 00:03:16.617704 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:16.617494 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7 podName:a10f7678-f6da-46bb-86eb-c0de2afb421c nodeName:}" failed. No retries permitted until 2026-04-21 00:03:32.617478261 +0000 UTC m=+34.230849759 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tc5d7" (UniqueName: "kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7") pod "network-check-target-s2mw9" (UID: "a10f7678-f6da-46bb-86eb-c0de2afb421c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 00:03:16.985466 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:16.985391 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j" Apr 21 00:03:16.985620 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:16.985390 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l" Apr 21 00:03:16.985620 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:16.985528 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212" Apr 21 00:03:16.985620 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:16.985390 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9" Apr 21 00:03:16.985779 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:16.985626 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8n24l" podUID="d4c3f54e-8135-4a92-b7dc-1bef279e0201" Apr 21 00:03:16.985779 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:16.985708 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c" Apr 21 00:03:18.985415 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:18.985229 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9" Apr 21 00:03:18.985754 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:18.985306 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j" Apr 21 00:03:18.985754 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:18.985319 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l" Apr 21 00:03:18.986455 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:18.986424 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c" Apr 21 00:03:18.986876 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:18.986850 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212" Apr 21 00:03:18.986964 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:18.986942 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8n24l" podUID="d4c3f54e-8135-4a92-b7dc-1bef279e0201" Apr 21 00:03:19.082550 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:19.082513 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xxv48" event={"ID":"fe295176-34bd-4593-a32e-cd3d077a6b0f","Type":"ContainerStarted","Data":"37c36fe3f24d11b7b1c29a03a82854f309643f950ea99ea8ee19cb59272b766d"} Apr 21 00:03:19.101066 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:19.100710 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xxv48" podStartSLOduration=3.0865346909999998 podStartE2EDuration="20.100693004s" podCreationTimestamp="2026-04-21 00:02:59 +0000 UTC" firstStartedPulling="2026-04-21 00:03:01.867534356 +0000 UTC m=+3.480905867" lastFinishedPulling="2026-04-21 00:03:18.881692679 +0000 UTC m=+20.495064180" observedRunningTime="2026-04-21 00:03:19.099356416 +0000 UTC m=+20.712727937" 
watchObservedRunningTime="2026-04-21 00:03:19.100693004 +0000 UTC m=+20.714064526" Apr 21 00:03:20.085643 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.085447 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" event={"ID":"441c3a92-f5bd-446c-beb0-d70639bbb401","Type":"ContainerStarted","Data":"dc9f6a7744bb2b1ebb4f37d58fcbc1777cb4137a078688f95be90e018d16db8d"} Apr 21 00:03:20.087619 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.087598 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" event={"ID":"012b4bee-5b6f-4bec-9704-f110e7aba3eb","Type":"ContainerStarted","Data":"8696e8cd701cb4e7f4238d07f0bfb76153a2cfed363706547cc68591968c6fa8"} Apr 21 00:03:20.087721 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.087627 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" event={"ID":"012b4bee-5b6f-4bec-9704-f110e7aba3eb","Type":"ContainerStarted","Data":"bdd78c87a16861cc3506650b39d6d55acefa2d5976e44e5cde12b875c8aacd4c"} Apr 21 00:03:20.087721 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.087641 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" event={"ID":"012b4bee-5b6f-4bec-9704-f110e7aba3eb","Type":"ContainerStarted","Data":"f7acf6a27632c92f0899bc0d731e41119a8dcceb7e445187d6a67e6854e31632"} Apr 21 00:03:20.087721 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.087649 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" event={"ID":"012b4bee-5b6f-4bec-9704-f110e7aba3eb","Type":"ContainerStarted","Data":"c7109b958b1571e76c1f21663a0b87715a5ba3f77c2572b64f83112d0702b5e0"} Apr 21 00:03:20.087721 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.087657 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" 
event={"ID":"012b4bee-5b6f-4bec-9704-f110e7aba3eb","Type":"ContainerStarted","Data":"27a68373d38f640ed7fa585690d9537814bd57e0726ab8c6ee6cfcdf97128e17"} Apr 21 00:03:20.087721 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.087665 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" event={"ID":"012b4bee-5b6f-4bec-9704-f110e7aba3eb","Type":"ContainerStarted","Data":"568c5a908cccc5f3a7e64b858d222e6f3fc6b413ddce0f0aa71a73127f8a9438"} Apr 21 00:03:20.088747 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.088725 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zsbmm" event={"ID":"728512b6-8990-45f9-b0fa-89772f9c1362","Type":"ContainerStarted","Data":"ae245e6594ce85386f9f03f9650a91fd5b0f4ea6df7dc465635036fe0ec19c83"} Apr 21 00:03:20.090041 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.090015 2571 generic.go:358] "Generic (PLEG): container finished" podID="f398c142-f284-48f9-b608-6eb7229425ae" containerID="2e12829705d5ba457aad28a3587aab1a8cce4b037888a702f6be7121f710291d" exitCode=0 Apr 21 00:03:20.090157 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.090104 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mghn5" event={"ID":"f398c142-f284-48f9-b608-6eb7229425ae","Type":"ContainerDied","Data":"2e12829705d5ba457aad28a3587aab1a8cce4b037888a702f6be7121f710291d"} Apr 21 00:03:20.091283 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.091265 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9hvww" event={"ID":"dbf6c72e-bf6b-47f7-bac9-1b24d4a37975","Type":"ContainerStarted","Data":"f90d92d2ecb284131b56feb2d8c2916854ff73f3c7633f191a84aca0333c113b"} Apr 21 00:03:20.092440 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.092421 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gwszj" 
event={"ID":"918f7c2d-8780-4291-b141-5fb77d94b6cf","Type":"ContainerStarted","Data":"46048280760ce6d259a5b767e8e2f3961199d74ec26468e3b9d54232356f4e22"} Apr 21 00:03:20.093623 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.093604 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hvbch" event={"ID":"b578ee7e-4063-4906-b849-ca0d856e3c15","Type":"ContainerStarted","Data":"f64a2fc40c881eb757d833a5626e39878a5d8ababe269e84767a5dce325b63cd"} Apr 21 00:03:20.115016 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.114967 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zsbmm" podStartSLOduration=4.095914789 podStartE2EDuration="21.114951673s" podCreationTimestamp="2026-04-21 00:02:59 +0000 UTC" firstStartedPulling="2026-04-21 00:03:01.862542896 +0000 UTC m=+3.475914399" lastFinishedPulling="2026-04-21 00:03:18.881579767 +0000 UTC m=+20.494951283" observedRunningTime="2026-04-21 00:03:20.103011154 +0000 UTC m=+21.716382686" watchObservedRunningTime="2026-04-21 00:03:20.114951673 +0000 UTC m=+21.728323193" Apr 21 00:03:20.115301 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.115267 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9hvww" podStartSLOduration=4.099886237 podStartE2EDuration="21.115257752s" podCreationTimestamp="2026-04-21 00:02:59 +0000 UTC" firstStartedPulling="2026-04-21 00:03:01.866193346 +0000 UTC m=+3.479564848" lastFinishedPulling="2026-04-21 00:03:18.881564852 +0000 UTC m=+20.494936363" observedRunningTime="2026-04-21 00:03:20.114677774 +0000 UTC m=+21.728049293" watchObservedRunningTime="2026-04-21 00:03:20.115257752 +0000 UTC m=+21.728629274" Apr 21 00:03:20.129547 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.129505 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hvbch" podStartSLOduration=4.062257207 
podStartE2EDuration="21.129490729s" podCreationTimestamp="2026-04-21 00:02:59 +0000 UTC" firstStartedPulling="2026-04-21 00:03:01.867838182 +0000 UTC m=+3.481209681" lastFinishedPulling="2026-04-21 00:03:18.935071704 +0000 UTC m=+20.548443203" observedRunningTime="2026-04-21 00:03:20.129294278 +0000 UTC m=+21.742665800" watchObservedRunningTime="2026-04-21 00:03:20.129490729 +0000 UTC m=+21.742862255" Apr 21 00:03:20.144635 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.144587 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gwszj" podStartSLOduration=4.122017686 podStartE2EDuration="21.144569913s" podCreationTimestamp="2026-04-21 00:02:59 +0000 UTC" firstStartedPulling="2026-04-21 00:03:01.859292707 +0000 UTC m=+3.472664217" lastFinishedPulling="2026-04-21 00:03:18.881844945 +0000 UTC m=+20.495216444" observedRunningTime="2026-04-21 00:03:20.144078119 +0000 UTC m=+21.757449642" watchObservedRunningTime="2026-04-21 00:03:20.144569913 +0000 UTC m=+21.757941435" Apr 21 00:03:20.262476 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.262455 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 00:03:20.965050 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.964941 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T00:03:20.262472246Z","UUID":"4907a715-bbb9-4660-8102-7b0ad8040167","Handler":null,"Name":"","Endpoint":""} Apr 21 00:03:20.968280 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.968256 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 00:03:20.968400 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.968288 2571 
csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 00:03:20.985224 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.985200 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9" Apr 21 00:03:20.985362 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.985209 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l" Apr 21 00:03:20.985362 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:20.985322 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c" Apr 21 00:03:20.985478 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:20.985207 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j" Apr 21 00:03:20.986496 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:20.986466 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212" Apr 21 00:03:20.986496 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:20.985804 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8n24l" podUID="d4c3f54e-8135-4a92-b7dc-1bef279e0201" Apr 21 00:03:21.097208 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:21.097165 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xkz4p" event={"ID":"75287a4a-4503-4680-a428-655e61b86e85","Type":"ContainerStarted","Data":"088928d0f428b8a3e76adc589fcc2d88410a7474568de6e666ea5e03f38f21f1"} Apr 21 00:03:21.099685 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:21.099649 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" event={"ID":"441c3a92-f5bd-446c-beb0-d70639bbb401","Type":"ContainerStarted","Data":"c870807d93f9fa2fbfa5756c19a220fd19bc71959a6b7987fa5a7942bc262f5a"} Apr 21 00:03:21.111172 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:21.111129 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xkz4p" podStartSLOduration=5.098237029 podStartE2EDuration="22.111115155s" podCreationTimestamp="2026-04-21 00:02:59 +0000 UTC" firstStartedPulling="2026-04-21 00:03:01.868717196 +0000 UTC m=+3.482088694" lastFinishedPulling="2026-04-21 00:03:18.881595317 +0000 UTC m=+20.494966820" observedRunningTime="2026-04-21 00:03:21.110422626 +0000 UTC m=+22.723794148" watchObservedRunningTime="2026-04-21 00:03:21.111115155 +0000 UTC m=+22.724486673" Apr 21 00:03:22.103878 ip-10-0-143-115 kubenswrapper[2571]: I0421 
00:03:22.103643 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" event={"ID":"441c3a92-f5bd-446c-beb0-d70639bbb401","Type":"ContainerStarted","Data":"a2093ddce53de2463a38556b62921b0368abce9698057d1923aa5c6d02a41521"} Apr 21 00:03:22.106996 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:22.106969 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" event={"ID":"012b4bee-5b6f-4bec-9704-f110e7aba3eb","Type":"ContainerStarted","Data":"32b42497631f25434f6515d047f3d90994f37f6d273374679f82aa96619f6d19"} Apr 21 00:03:22.120048 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:22.120003 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lcdmd" podStartSLOduration=3.799722014 podStartE2EDuration="23.119990526s" podCreationTimestamp="2026-04-21 00:02:59 +0000 UTC" firstStartedPulling="2026-04-21 00:03:01.865006751 +0000 UTC m=+3.478378263" lastFinishedPulling="2026-04-21 00:03:21.185275261 +0000 UTC m=+22.798646775" observedRunningTime="2026-04-21 00:03:22.119941215 +0000 UTC m=+23.733312737" watchObservedRunningTime="2026-04-21 00:03:22.119990526 +0000 UTC m=+23.733362046" Apr 21 00:03:22.662198 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:22.662160 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l" Apr 21 00:03:22.662373 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:22.662303 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 00:03:22.662373 ip-10-0-143-115 kubenswrapper[2571]: E0421 
00:03:22.662369 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret podName:d4c3f54e-8135-4a92-b7dc-1bef279e0201 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:38.662354353 +0000 UTC m=+40.275725852 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret") pod "global-pull-secret-syncer-8n24l" (UID: "d4c3f54e-8135-4a92-b7dc-1bef279e0201") : object "kube-system"/"original-pull-secret" not registered Apr 21 00:03:22.985346 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:22.985271 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j" Apr 21 00:03:22.985501 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:22.985276 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9" Apr 21 00:03:22.985501 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:22.985401 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212" Apr 21 00:03:22.985501 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:22.985276 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l" Apr 21 00:03:22.985501 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:22.985488 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c" Apr 21 00:03:22.985697 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:22.985572 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8n24l" podUID="d4c3f54e-8135-4a92-b7dc-1bef279e0201" Apr 21 00:03:24.172848 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:24.172685 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zsbmm" Apr 21 00:03:24.176355 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:24.173852 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zsbmm" Apr 21 00:03:24.985687 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:24.985649 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9" Apr 21 00:03:24.985687 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:24.985676 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l" Apr 21 00:03:24.985936 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:24.985752 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j" Apr 21 00:03:24.985936 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:24.985763 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c" Apr 21 00:03:24.985936 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:24.985846 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212" Apr 21 00:03:24.985936 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:24.985884 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8n24l" podUID="d4c3f54e-8135-4a92-b7dc-1bef279e0201" Apr 21 00:03:25.116131 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:25.116085 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" event={"ID":"012b4bee-5b6f-4bec-9704-f110e7aba3eb","Type":"ContainerStarted","Data":"fb2a1cc45b4879a419f0c48c2c6ec14af3db26bc7c39f3b603f8a142417dcf26"} Apr 21 00:03:25.116406 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:25.116387 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:25.116518 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:25.116416 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:25.117846 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:25.117821 2571 generic.go:358] "Generic (PLEG): container finished" podID="f398c142-f284-48f9-b608-6eb7229425ae" containerID="856c4f69f3d1ef5e5b58d40e92b28c3f0c7736c2ebf4ad9e60a716f5f73a4ab6" exitCode=0 Apr 21 00:03:25.117934 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:25.117848 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mghn5" event={"ID":"f398c142-f284-48f9-b608-6eb7229425ae","Type":"ContainerDied","Data":"856c4f69f3d1ef5e5b58d40e92b28c3f0c7736c2ebf4ad9e60a716f5f73a4ab6"} Apr 21 00:03:25.118206 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:25.118180 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zsbmm" Apr 21 00:03:25.118536 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:25.118512 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zsbmm" Apr 21 00:03:25.130846 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:25.130822 2571 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:25.141220 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:25.141187 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" podStartSLOduration=8.705631364 podStartE2EDuration="26.141175761s" podCreationTimestamp="2026-04-21 00:02:59 +0000 UTC" firstStartedPulling="2026-04-21 00:03:01.864659685 +0000 UTC m=+3.478031187" lastFinishedPulling="2026-04-21 00:03:19.300204085 +0000 UTC m=+20.913575584" observedRunningTime="2026-04-21 00:03:25.140731005 +0000 UTC m=+26.754102527" watchObservedRunningTime="2026-04-21 00:03:25.141175761 +0000 UTC m=+26.754547281" Apr 21 00:03:26.086836 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:26.086807 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6fz9j"] Apr 21 00:03:26.087209 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:26.086950 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j" Apr 21 00:03:26.087209 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:26.087040 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212" Apr 21 00:03:26.089574 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:26.089543 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8n24l"] Apr 21 00:03:26.089678 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:26.089645 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l" Apr 21 00:03:26.089763 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:26.089741 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8n24l" podUID="d4c3f54e-8135-4a92-b7dc-1bef279e0201" Apr 21 00:03:26.090349 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:26.090319 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-s2mw9"] Apr 21 00:03:26.090429 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:26.090421 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9" Apr 21 00:03:26.090534 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:26.090513 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c" Apr 21 00:03:26.125392 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:26.125361 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mghn5" event={"ID":"f398c142-f284-48f9-b608-6eb7229425ae","Type":"ContainerStarted","Data":"7f25220afc2718234f18a2c1e15ad54575e782146e6ac49441b1bbcc6e0f3af5"} Apr 21 00:03:26.126584 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:26.126128 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:26.140347 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:26.140328 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:03:27.129302 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:27.129212 2571 generic.go:358] "Generic (PLEG): container finished" podID="f398c142-f284-48f9-b608-6eb7229425ae" containerID="7f25220afc2718234f18a2c1e15ad54575e782146e6ac49441b1bbcc6e0f3af5" exitCode=0 Apr 21 00:03:27.129302 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:27.129241 2571 generic.go:358] "Generic (PLEG): container finished" podID="f398c142-f284-48f9-b608-6eb7229425ae" containerID="55213726dfa6dafa257365569f8101c953d92c5b8b9b3a7882e1253c76f6472a" exitCode=0 Apr 21 00:03:27.129302 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:27.129249 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mghn5" event={"ID":"f398c142-f284-48f9-b608-6eb7229425ae","Type":"ContainerDied","Data":"7f25220afc2718234f18a2c1e15ad54575e782146e6ac49441b1bbcc6e0f3af5"} Apr 21 00:03:27.129302 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:27.129286 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mghn5" 
event={"ID":"f398c142-f284-48f9-b608-6eb7229425ae","Type":"ContainerDied","Data":"55213726dfa6dafa257365569f8101c953d92c5b8b9b3a7882e1253c76f6472a"} Apr 21 00:03:27.985933 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:27.985711 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l" Apr 21 00:03:27.986088 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:27.985761 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j" Apr 21 00:03:27.986088 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:27.986055 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8n24l" podUID="d4c3f54e-8135-4a92-b7dc-1bef279e0201" Apr 21 00:03:27.986219 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:27.985778 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9" Apr 21 00:03:27.986219 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:27.986164 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212" Apr 21 00:03:27.986307 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:27.986233 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c" Apr 21 00:03:29.985938 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:29.985856 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l" Apr 21 00:03:29.986674 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:29.985967 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8n24l" podUID="d4c3f54e-8135-4a92-b7dc-1bef279e0201" Apr 21 00:03:29.986674 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:29.986292 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9" Apr 21 00:03:29.986674 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:29.986365 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-s2mw9" podUID="a10f7678-f6da-46bb-86eb-c0de2afb421c" Apr 21 00:03:29.986674 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:29.986417 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j" Apr 21 00:03:29.986674 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:29.986533 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6fz9j" podUID="173d74c8-1f07-4764-a03f-8091e02dc212" Apr 21 00:03:31.662046 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.662016 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-115.ec2.internal" event="NodeReady" Apr 21 00:03:31.662457 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.662178 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 00:03:31.694071 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.694034 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-56998694b4-5kqfw"] Apr 21 00:03:31.720866 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.720830 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29612160-ghs4n"] Apr 21 00:03:31.721217 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.721189 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.726317 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.726173 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 00:03:31.726317 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.726178 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 00:03:31.726317 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.726310 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 00:03:31.732414 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.732394 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-pctqm\"" Apr 21 00:03:31.735401 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.735375 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29612160-ghs4n"] Apr 21 00:03:31.735401 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.735405 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56998694b4-5kqfw"] Apr 21 00:03:31.735554 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.735416 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr"] Apr 21 00:03:31.735639 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.735619 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29612160-ghs4n" Apr 21 00:03:31.736540 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.736505 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 00:03:31.740981 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.740963 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"serviceca\"" Apr 21 00:03:31.755373 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.755348 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d4kjt"] Apr 21 00:03:31.755856 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.755835 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr" Apr 21 00:03:31.759040 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.759017 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 21 00:03:31.759434 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.759413 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 21 00:03:31.759492 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.759442 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-cznvh\"" Apr 21 00:03:31.759729 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.759710 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 21 00:03:31.784153 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.784124 2571 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z"] Apr 21 00:03:31.784317 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.784293 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d4kjt" Apr 21 00:03:31.788537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.788513 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-rzxpw\"" Apr 21 00:03:31.788864 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.788843 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 21 00:03:31.789250 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.789231 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 21 00:03:31.805874 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.805852 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq"] Apr 21 00:03:31.806035 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.806016 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z" Apr 21 00:03:31.808490 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.808470 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 00:03:31.808589 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.808522 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-fjh5b\"" Apr 21 00:03:31.808652 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.808590 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 00:03:31.818805 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.818784 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jtsx5"] Apr 21 00:03:31.818960 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.818941 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" Apr 21 00:03:31.821889 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.821869 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 21 00:03:31.821987 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.821927 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 00:03:31.822286 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.822262 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 00:03:31.823452 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.823432 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-xljqt\"" Apr 21 00:03:31.823554 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.823456 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 21 00:03:31.834419 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.834396 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c729e77-783c-4ed2-831a-538689f33279-serviceca\") pod \"image-pruner-29612160-ghs4n\" (UID: \"8c729e77-783c-4ed2-831a-538689f33279\") " pod="openshift-image-registry/image-pruner-29612160-ghs4n" Apr 21 00:03:31.834530 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.834442 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4420432-cd86-4b5b-a350-f40c3c3cb85b-trusted-ca\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " 
pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.834530 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.834469 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9ctx\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-kube-api-access-d9ctx\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.834639 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.834570 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b4420432-cd86-4b5b-a350-f40c3c3cb85b-image-registry-private-configuration\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.834639 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.834593 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-certificates\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.834745 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.834650 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6btc\" (UniqueName: \"kubernetes.io/projected/8c729e77-783c-4ed2-831a-538689f33279-kube-api-access-d6btc\") pod \"image-pruner-29612160-ghs4n\" (UID: \"8c729e77-783c-4ed2-831a-538689f33279\") " pod="openshift-image-registry/image-pruner-29612160-ghs4n" Apr 21 00:03:31.834745 ip-10-0-143-115 kubenswrapper[2571]: I0421 
00:03:31.834730 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.834847 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.834816 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4420432-cd86-4b5b-a350-f40c3c3cb85b-installation-pull-secrets\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.834900 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.834860 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4420432-cd86-4b5b-a350-f40c3c3cb85b-ca-trust-extracted\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.834900 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.834885 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-bound-sa-token\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.835687 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.835666 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-69675cd558-ml7nv"] Apr 21 00:03:31.835842 ip-10-0-143-115 kubenswrapper[2571]: 
I0421 00:03:31.835817 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jtsx5" Apr 21 00:03:31.839032 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.839010 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-7jg6t\"" Apr 21 00:03:31.839032 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.839025 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 00:03:31.839352 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.839322 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 00:03:31.854312 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.854291 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4"] Apr 21 00:03:31.854402 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.854355 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:03:31.857147 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.856957 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 21 00:03:31.857147 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.857000 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-h76vq\"" Apr 21 00:03:31.857147 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.857110 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 00:03:31.857449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.857391 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 21 00:03:31.857539 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.857503 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 00:03:31.857602 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.857538 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 21 00:03:31.857669 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.857652 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 21 00:03:31.874632 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.874612 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw"] Apr 21 00:03:31.874793 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.874771 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4" Apr 21 00:03:31.877479 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.877405 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 21 00:03:31.877479 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.877432 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 21 00:03:31.877479 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.877438 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 00:03:31.877784 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.877764 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-pl24q\"" Apr 21 00:03:31.878026 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.878011 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 00:03:31.886728 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.886680 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-b4dln"] Apr 21 00:03:31.886855 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.886838 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw" Apr 21 00:03:31.890040 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.889854 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 00:03:31.890040 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.889903 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 00:03:31.890040 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.889906 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 21 00:03:31.890040 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.889982 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 00:03:31.906309 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.906290 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788cb5b5d9-259j7"] Apr 21 00:03:31.906456 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.906439 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-b4dln" Apr 21 00:03:31.910508 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.910488 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 21 00:03:31.910666 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.910644 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-4jdhf\"" Apr 21 00:03:31.910734 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.910695 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 21 00:03:31.910734 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.910655 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 00:03:31.910840 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.910801 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 00:03:31.917488 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.917468 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 21 00:03:31.929185 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.929157 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj"] Apr 21 00:03:31.929586 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.929564 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788cb5b5d9-259j7" Apr 21 00:03:31.932199 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.932176 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 21 00:03:31.932455 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.932437 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-wpmmz\"" Apr 21 00:03:31.935801 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.935778 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.935896 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.935825 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4420432-cd86-4b5b-a350-f40c3c3cb85b-installation-pull-secrets\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.935896 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.935871 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4420432-cd86-4b5b-a350-f40c3c3cb85b-ca-trust-extracted\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.935896 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:31.935876 
2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 00:03:31.936048 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:31.935904 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56998694b4-5kqfw: secret "image-registry-tls" not found Apr 21 00:03:31.936048 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.935902 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-bound-sa-token\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.936048 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.935933 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgmks\" (UniqueName: \"kubernetes.io/projected/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-kube-api-access-mgmks\") pod \"cluster-samples-operator-6dc5bdb6b4-2l2dr\" (UID: \"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr" Apr 21 00:03:31.936048 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:31.935971 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls podName:b4420432-cd86-4b5b-a350-f40c3c3cb85b nodeName:}" failed. No retries permitted until 2026-04-21 00:03:32.435952014 +0000 UTC m=+34.049323529 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls") pod "image-registry-56998694b4-5kqfw" (UID: "b4420432-cd86-4b5b-a350-f40c3c3cb85b") : secret "image-registry-tls" not found Apr 21 00:03:31.936048 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.935990 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m2p9z\" (UID: \"8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z" Apr 21 00:03:31.936048 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936016 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: \"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" Apr 21 00:03:31.936048 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936039 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvcqm\" (UniqueName: \"kubernetes.io/projected/fb84b042-37b8-43f5-94e4-43fb54d2041b-kube-api-access-pvcqm\") pod \"network-check-source-8894fc9bd-jtsx5\" (UID: \"fb84b042-37b8-43f5-94e4-43fb54d2041b\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jtsx5" Apr 21 00:03:31.936479 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936067 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6vpz\" (UniqueName: 
\"kubernetes.io/projected/a2f4bd8a-7dae-45c0-9b1e-a5c145a09876-kube-api-access-r6vpz\") pod \"volume-data-source-validator-7c6cbb6c87-d4kjt\" (UID: \"a2f4bd8a-7dae-45c0-9b1e-a5c145a09876\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d4kjt" Apr 21 00:03:31.936479 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936305 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2rcv\" (UniqueName: \"kubernetes.io/projected/4854cd30-d0eb-4603-8e32-c7919b625f6c-kube-api-access-s2rcv\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: \"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" Apr 21 00:03:31.936479 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936331 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4420432-cd86-4b5b-a350-f40c3c3cb85b-ca-trust-extracted\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.936479 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936344 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c729e77-783c-4ed2-831a-538689f33279-serviceca\") pod \"image-pruner-29612160-ghs4n\" (UID: \"8c729e77-783c-4ed2-831a-538689f33279\") " pod="openshift-image-registry/image-pruner-29612160-ghs4n" Apr 21 00:03:31.936479 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936379 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4854cd30-d0eb-4603-8e32-c7919b625f6c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: 
\"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" Apr 21 00:03:31.936479 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936419 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4420432-cd86-4b5b-a350-f40c3c3cb85b-trusted-ca\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.936479 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936443 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9ctx\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-kube-api-access-d9ctx\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.936841 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936486 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b4420432-cd86-4b5b-a350-f40c3c3cb85b-image-registry-private-configuration\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.936841 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936513 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-certificates\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.936841 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936550 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2l2dr\" (UID: \"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr" Apr 21 00:03:31.936841 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936584 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6btc\" (UniqueName: \"kubernetes.io/projected/8c729e77-783c-4ed2-831a-538689f33279-kube-api-access-d6btc\") pod \"image-pruner-29612160-ghs4n\" (UID: \"8c729e77-783c-4ed2-831a-538689f33279\") " pod="openshift-image-registry/image-pruner-29612160-ghs4n" Apr 21 00:03:31.936841 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936610 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-m2p9z\" (UID: \"8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z" Apr 21 00:03:31.937239 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.936896 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c729e77-783c-4ed2-831a-538689f33279-serviceca\") pod \"image-pruner-29612160-ghs4n\" (UID: \"8c729e77-783c-4ed2-831a-538689f33279\") " pod="openshift-image-registry/image-pruner-29612160-ghs4n" Apr 21 00:03:31.937616 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.937598 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-certificates\") pod 
\"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.938004 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.937981 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4420432-cd86-4b5b-a350-f40c3c3cb85b-trusted-ca\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.941046 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.940983 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b4420432-cd86-4b5b-a350-f40c3c3cb85b-image-registry-private-configuration\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.941046 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.941014 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4420432-cd86-4b5b-a350-f40c3c3cb85b-installation-pull-secrets\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.945380 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.945357 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-bound-sa-token\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.946491 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.946421 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-d9ctx\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-kube-api-access-d9ctx\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:31.946632 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.946582 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6btc\" (UniqueName: \"kubernetes.io/projected/8c729e77-783c-4ed2-831a-538689f33279-kube-api-access-d6btc\") pod \"image-pruner-29612160-ghs4n\" (UID: \"8c729e77-783c-4ed2-831a-538689f33279\") " pod="openshift-image-registry/image-pruner-29612160-ghs4n" Apr 21 00:03:31.949580 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.949558 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-p9l6p"] Apr 21 00:03:31.949698 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.949683 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" Apr 21 00:03:31.952265 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.952242 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 21 00:03:31.952265 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.952266 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 21 00:03:31.952411 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.952368 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 21 00:03:31.952743 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.952729 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 21 00:03:31.970465 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.970446 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9"] Apr 21 00:03:31.970620 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.970587 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p" Apr 21 00:03:31.973438 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.973405 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 21 00:03:31.973538 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.973456 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 21 00:03:31.973538 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.973474 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 21 00:03:31.973538 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.973482 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 21 00:03:31.973690 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.973410 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-fkrww\"" Apr 21 00:03:31.979273 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.979253 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 21 00:03:31.986525 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.986507 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nxkjd"] Apr 21 00:03:31.986644 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.986611 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9" Apr 21 00:03:31.986644 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.986612 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l" Apr 21 00:03:31.986770 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.986649 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j" Apr 21 00:03:31.986872 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.986854 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9" Apr 21 00:03:31.989253 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.989233 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-24wfg\"" Apr 21 00:03:31.989348 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.989330 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 00:03:31.989800 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.989783 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 21 00:03:31.989987 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.989970 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 00:03:31.990131 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.990110 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 21 00:03:31.990358 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.990343 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-d98l9\"" Apr 21 00:03:31.990556 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.990496 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 21 00:03:31.990920 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.990522 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 21 00:03:31.990920 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:31.990581 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-n52qc\"" Apr 21 00:03:32.005216 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.005084 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6w5rw"] Apr 21 00:03:32.005341 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.005309 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nxkjd" Apr 21 00:03:32.009453 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.009429 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 00:03:32.010913 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.009712 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 00:03:32.011129 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.011078 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-79cql\"" Apr 21 00:03:32.025970 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.025953 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z"] Apr 21 00:03:32.026191 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.026175 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6w5rw" Apr 21 00:03:32.026431 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.026185 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jtsx5"] Apr 21 00:03:32.026516 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.026466 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d4kjt"] Apr 21 00:03:32.026516 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.026483 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw"] Apr 21 00:03:32.026516 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.026495 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr"] Apr 21 00:03:32.026516 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.026506 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq"] Apr 21 00:03:32.026516 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.026517 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6w5rw"] Apr 21 00:03:32.026806 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.026529 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-p9l6p"] Apr 21 00:03:32.026806 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.026544 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nxkjd"] Apr 21 00:03:32.026806 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.026554 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-69675cd558-ml7nv"] Apr 21 00:03:32.026806 ip-10-0-143-115 kubenswrapper[2571]: I0421 
00:03:32.026565 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4"] Apr 21 00:03:32.026806 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.026576 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-b4dln"] Apr 21 00:03:32.026806 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.026586 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9"] Apr 21 00:03:32.026806 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.026596 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788cb5b5d9-259j7"] Apr 21 00:03:32.026806 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.026606 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj"] Apr 21 00:03:32.029867 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.029845 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 00:03:32.029867 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.029864 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 00:03:32.029996 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.029847 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2glpj\"" Apr 21 00:03:32.030049 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.030006 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 00:03:32.036932 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.036912 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2l2dr\" (UID: \"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr" Apr 21 00:03:32.037027 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.036966 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vvrc\" (UniqueName: \"kubernetes.io/projected/e4103e8a-c222-4304-953c-43f57e73acef-kube-api-access-7vvrc\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:03:32.037027 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037000 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/25ad4b12-b3eb-4705-9146-fc282c21c226-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" Apr 21 00:03:32.037155 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037028 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:03:32.037155 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.037053 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 
00:03:32.037155 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037051 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/25ad4b12-b3eb-4705-9146-fc282c21c226-ca\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" Apr 21 00:03:32.037155 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.037138 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls podName:f81ce5f8-4def-44ee-ae07-4b37a1fbe3be nodeName:}" failed. No retries permitted until 2026-04-21 00:03:32.537118747 +0000 UTC m=+34.150490249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2l2dr" (UID: "f81ce5f8-4def-44ee-ae07-4b37a1fbe3be") : secret "samples-operator-tls" not found Apr 21 00:03:32.037371 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037163 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73af315a-43db-4687-aa2e-2555ab2f3d65-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln" Apr 21 00:03:32.037371 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037192 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvcqm\" (UniqueName: \"kubernetes.io/projected/fb84b042-37b8-43f5-94e4-43fb54d2041b-kube-api-access-pvcqm\") pod \"network-check-source-8894fc9bd-jtsx5\" (UID: \"fb84b042-37b8-43f5-94e4-43fb54d2041b\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jtsx5" Apr 21 00:03:32.037371 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037220 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m2p9z\" (UID: \"8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z" Apr 21 00:03:32.037371 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037245 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: \"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" Apr 21 00:03:32.037371 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037271 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-default-certificate\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:03:32.037371 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037296 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73af315a-43db-4687-aa2e-2555ab2f3d65-tmp\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln" Apr 21 00:03:32.037371 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.037340 2571 
secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 00:03:32.037655 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.037371 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 00:03:32.037655 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037323 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6vpz\" (UniqueName: \"kubernetes.io/projected/a2f4bd8a-7dae-45c0-9b1e-a5c145a09876-kube-api-access-r6vpz\") pod \"volume-data-source-validator-7c6cbb6c87-d4kjt\" (UID: \"a2f4bd8a-7dae-45c0-9b1e-a5c145a09876\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d4kjt" Apr 21 00:03:32.037655 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.037405 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert podName:8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e nodeName:}" failed. No retries permitted until 2026-04-21 00:03:32.537387199 +0000 UTC m=+34.150758717 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-m2p9z" (UID: "8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e") : secret "networking-console-plugin-cert" not found Apr 21 00:03:32.037655 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037448 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2rcv\" (UniqueName: \"kubernetes.io/projected/4854cd30-d0eb-4603-8e32-c7919b625f6c-kube-api-access-s2rcv\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: \"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" Apr 21 00:03:32.037655 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.037465 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls podName:4854cd30-d0eb-4603-8e32-c7919b625f6c nodeName:}" failed. No retries permitted until 2026-04-21 00:03:32.537453558 +0000 UTC m=+34.150825077 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cn6dq" (UID: "4854cd30-d0eb-4603-8e32-c7919b625f6c") : secret "cluster-monitoring-operator-tls" not found Apr 21 00:03:32.037655 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037490 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73af315a-43db-4687-aa2e-2555ab2f3d65-serving-cert\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln" Apr 21 00:03:32.037655 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037522 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4854cd30-d0eb-4603-8e32-c7919b625f6c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: \"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" Apr 21 00:03:32.037655 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037540 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-stats-auth\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:03:32.037655 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037577 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7060df7d-1449-4026-9723-09376f46de81-klusterlet-config\") pod 
\"klusterlet-addon-workmgr-77784c6fb8-jc2kw\" (UID: \"7060df7d-1449-4026-9723-09376f46de81\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw" Apr 21 00:03:32.037655 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037608 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2d04f98f-58fd-479e-8553-0b49ee6f1c58-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-788cb5b5d9-259j7\" (UID: \"2d04f98f-58fd-479e-8553-0b49ee6f1c58\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788cb5b5d9-259j7" Apr 21 00:03:32.038110 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037677 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxnvv\" (UniqueName: \"kubernetes.io/projected/7060df7d-1449-4026-9723-09376f46de81-kube-api-access-qxnvv\") pod \"klusterlet-addon-workmgr-77784c6fb8-jc2kw\" (UID: \"7060df7d-1449-4026-9723-09376f46de81\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw" Apr 21 00:03:32.038110 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:03:32.038110 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037756 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/010e4dff-e2ae-4168-aa42-10e2537edc3c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-m59z4\" (UID: 
\"010e4dff-e2ae-4168-aa42-10e2537edc3c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4" Apr 21 00:03:32.038110 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037776 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-m2p9z\" (UID: \"8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z" Apr 21 00:03:32.038110 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037803 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/73af315a-43db-4687-aa2e-2555ab2f3d65-snapshots\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln" Apr 21 00:03:32.038110 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/010e4dff-e2ae-4168-aa42-10e2537edc3c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-m59z4\" (UID: \"010e4dff-e2ae-4168-aa42-10e2537edc3c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4" Apr 21 00:03:32.038110 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037869 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/25ad4b12-b3eb-4705-9146-fc282c21c226-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" Apr 21 00:03:32.038110 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037931 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/25ad4b12-b3eb-4705-9146-fc282c21c226-hub\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" Apr 21 00:03:32.038110 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037959 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9wst\" (UniqueName: \"kubernetes.io/projected/010e4dff-e2ae-4168-aa42-10e2537edc3c-kube-api-access-w9wst\") pod \"kube-storage-version-migrator-operator-6769c5d45-m59z4\" (UID: \"010e4dff-e2ae-4168-aa42-10e2537edc3c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4" Apr 21 00:03:32.038110 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.037994 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgmks\" (UniqueName: \"kubernetes.io/projected/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-kube-api-access-mgmks\") pod \"cluster-samples-operator-6dc5bdb6b4-2l2dr\" (UID: \"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr" Apr 21 00:03:32.038110 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.038011 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73af315a-43db-4687-aa2e-2555ab2f3d65-service-ca-bundle\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln" 
Apr 21 00:03:32.038110 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.038042 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hhdm\" (UniqueName: \"kubernetes.io/projected/73af315a-43db-4687-aa2e-2555ab2f3d65-kube-api-access-8hhdm\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln" Apr 21 00:03:32.038110 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.038070 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf689\" (UniqueName: \"kubernetes.io/projected/2d04f98f-58fd-479e-8553-0b49ee6f1c58-kube-api-access-sf689\") pod \"managed-serviceaccount-addon-agent-788cb5b5d9-259j7\" (UID: \"2d04f98f-58fd-479e-8553-0b49ee6f1c58\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788cb5b5d9-259j7" Apr 21 00:03:32.038110 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.038113 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7060df7d-1449-4026-9723-09376f46de81-tmp\") pod \"klusterlet-addon-workmgr-77784c6fb8-jc2kw\" (UID: \"7060df7d-1449-4026-9723-09376f46de81\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw" Apr 21 00:03:32.038710 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.038140 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/25ad4b12-b3eb-4705-9146-fc282c21c226-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" Apr 21 00:03:32.038710 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.038172 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g4ct\" (UniqueName: \"kubernetes.io/projected/25ad4b12-b3eb-4705-9146-fc282c21c226-kube-api-access-8g4ct\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" Apr 21 00:03:32.038710 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.038226 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4854cd30-d0eb-4603-8e32-c7919b625f6c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: \"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" Apr 21 00:03:32.038710 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.038352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-m2p9z\" (UID: \"8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z" Apr 21 00:03:32.044059 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.044038 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29612160-ghs4n" Apr 21 00:03:32.049765 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.049744 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6vpz\" (UniqueName: \"kubernetes.io/projected/a2f4bd8a-7dae-45c0-9b1e-a5c145a09876-kube-api-access-r6vpz\") pod \"volume-data-source-validator-7c6cbb6c87-d4kjt\" (UID: \"a2f4bd8a-7dae-45c0-9b1e-a5c145a09876\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d4kjt" Apr 21 00:03:32.050292 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.050268 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvcqm\" (UniqueName: \"kubernetes.io/projected/fb84b042-37b8-43f5-94e4-43fb54d2041b-kube-api-access-pvcqm\") pod \"network-check-source-8894fc9bd-jtsx5\" (UID: \"fb84b042-37b8-43f5-94e4-43fb54d2041b\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jtsx5" Apr 21 00:03:32.050406 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.050392 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgmks\" (UniqueName: \"kubernetes.io/projected/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-kube-api-access-mgmks\") pod \"cluster-samples-operator-6dc5bdb6b4-2l2dr\" (UID: \"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr" Apr 21 00:03:32.051185 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.051160 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2rcv\" (UniqueName: \"kubernetes.io/projected/4854cd30-d0eb-4603-8e32-c7919b625f6c-kube-api-access-s2rcv\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: \"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" Apr 21 00:03:32.095186 ip-10-0-143-115 kubenswrapper[2571]: I0421 
00:03:32.095146 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d4kjt" Apr 21 00:03:32.138736 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.138657 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/010e4dff-e2ae-4168-aa42-10e2537edc3c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-m59z4\" (UID: \"010e4dff-e2ae-4168-aa42-10e2537edc3c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4" Apr 21 00:03:32.138736 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.138731 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/25ad4b12-b3eb-4705-9146-fc282c21c226-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" Apr 21 00:03:32.138928 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.138762 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e91dab5b-ef33-493b-9cc9-99941410ef37-serving-cert\") pod \"console-operator-9d4b6777b-p9l6p\" (UID: \"e91dab5b-ef33-493b-9cc9-99941410ef37\") " pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p" Apr 21 00:03:32.138928 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.138793 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/25ad4b12-b3eb-4705-9146-fc282c21c226-hub\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" Apr 21 00:03:32.138928 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.138818 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9wst\" (UniqueName: \"kubernetes.io/projected/010e4dff-e2ae-4168-aa42-10e2537edc3c-kube-api-access-w9wst\") pod \"kube-storage-version-migrator-operator-6769c5d45-m59z4\" (UID: \"010e4dff-e2ae-4168-aa42-10e2537edc3c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4" Apr 21 00:03:32.138928 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.138848 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnsz9\" (UniqueName: \"kubernetes.io/projected/e28815ec-1f97-4757-b463-8aec1ad6b01e-kube-api-access-vnsz9\") pod \"ingress-canary-6w5rw\" (UID: \"e28815ec-1f97-4757-b463-8aec1ad6b01e\") " pod="openshift-ingress-canary/ingress-canary-6w5rw" Apr 21 00:03:32.138928 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.138870 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd" Apr 21 00:03:32.138928 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.138913 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73af315a-43db-4687-aa2e-2555ab2f3d65-service-ca-bundle\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln" Apr 21 00:03:32.139436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.138944 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8hhdm\" (UniqueName: \"kubernetes.io/projected/73af315a-43db-4687-aa2e-2555ab2f3d65-kube-api-access-8hhdm\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln" Apr 21 00:03:32.139436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.138973 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sf689\" (UniqueName: \"kubernetes.io/projected/2d04f98f-58fd-479e-8553-0b49ee6f1c58-kube-api-access-sf689\") pod \"managed-serviceaccount-addon-agent-788cb5b5d9-259j7\" (UID: \"2d04f98f-58fd-479e-8553-0b49ee6f1c58\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788cb5b5d9-259j7" Apr 21 00:03:32.139436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.138999 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgn64\" (UniqueName: \"kubernetes.io/projected/e91dab5b-ef33-493b-9cc9-99941410ef37-kube-api-access-tgn64\") pod \"console-operator-9d4b6777b-p9l6p\" (UID: \"e91dab5b-ef33-493b-9cc9-99941410ef37\") " pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p" Apr 21 00:03:32.139436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139026 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7060df7d-1449-4026-9723-09376f46de81-tmp\") pod \"klusterlet-addon-workmgr-77784c6fb8-jc2kw\" (UID: \"7060df7d-1449-4026-9723-09376f46de81\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw" Apr 21 00:03:32.139436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139052 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/25ad4b12-b3eb-4705-9146-fc282c21c226-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" Apr 21 00:03:32.139436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139083 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-config-volume\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd" Apr 21 00:03:32.139436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139130 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e91dab5b-ef33-493b-9cc9-99941410ef37-config\") pod \"console-operator-9d4b6777b-p9l6p\" (UID: \"e91dab5b-ef33-493b-9cc9-99941410ef37\") " pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p" Apr 21 00:03:32.139436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139162 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8g4ct\" (UniqueName: \"kubernetes.io/projected/25ad4b12-b3eb-4705-9146-fc282c21c226-kube-api-access-8g4ct\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" Apr 21 00:03:32.139436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139266 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert\") pod \"ingress-canary-6w5rw\" (UID: \"e28815ec-1f97-4757-b463-8aec1ad6b01e\") " pod="openshift-ingress-canary/ingress-canary-6w5rw" Apr 21 
00:03:32.139436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139370 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vvrc\" (UniqueName: \"kubernetes.io/projected/e4103e8a-c222-4304-953c-43f57e73acef-kube-api-access-7vvrc\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:03:32.139436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139410 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/25ad4b12-b3eb-4705-9146-fc282c21c226-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" Apr 21 00:03:32.139436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139438 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a171e5-c09c-4c2d-ad77-c2c5a6027985-serving-cert\") pod \"service-ca-operator-d6fc45fc5-p6kl9\" (UID: \"37a171e5-c09c-4c2d-ad77-c2c5a6027985\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9" Apr 21 00:03:32.140016 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139472 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:03:32.140016 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139819 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/73af315a-43db-4687-aa2e-2555ab2f3d65-service-ca-bundle\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln" Apr 21 00:03:32.140016 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139845 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7060df7d-1449-4026-9723-09376f46de81-tmp\") pod \"klusterlet-addon-workmgr-77784c6fb8-jc2kw\" (UID: \"7060df7d-1449-4026-9723-09376f46de81\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw" Apr 21 00:03:32.140016 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/25ad4b12-b3eb-4705-9146-fc282c21c226-ca\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" Apr 21 00:03:32.140016 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139892 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/25ad4b12-b3eb-4705-9146-fc282c21c226-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" Apr 21 00:03:32.140016 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.139924 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73af315a-43db-4687-aa2e-2555ab2f3d65-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln" Apr 21 
00:03:32.140320 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.140042 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 00:03:32.140320 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.140123 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs podName:e4103e8a-c222-4304-953c-43f57e73acef nodeName:}" failed. No retries permitted until 2026-04-21 00:03:32.640080687 +0000 UTC m=+34.253452202 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs") pod "router-default-69675cd558-ml7nv" (UID: "e4103e8a-c222-4304-953c-43f57e73acef") : secret "router-metrics-certs-default" not found Apr 21 00:03:32.140786 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.140682 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-default-certificate\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:03:32.140786 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.140744 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73af315a-43db-4687-aa2e-2555ab2f3d65-tmp\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln" Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.141072 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73af315a-43db-4687-aa2e-2555ab2f3d65-trusted-ca-bundle\") 
pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln" Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.141159 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73af315a-43db-4687-aa2e-2555ab2f3d65-serving-cert\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln" Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.141270 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-stats-auth\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.141306 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73af315a-43db-4687-aa2e-2555ab2f3d65-tmp\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln" Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.141328 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7060df7d-1449-4026-9723-09376f46de81-klusterlet-config\") pod \"klusterlet-addon-workmgr-77784c6fb8-jc2kw\" (UID: \"7060df7d-1449-4026-9723-09376f46de81\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw" Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.141358 2571 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2d04f98f-58fd-479e-8553-0b49ee6f1c58-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-788cb5b5d9-259j7\" (UID: \"2d04f98f-58fd-479e-8553-0b49ee6f1c58\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788cb5b5d9-259j7" Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.141392 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-tmp-dir\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd" Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.141893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/010e4dff-e2ae-4168-aa42-10e2537edc3c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-m59z4\" (UID: \"010e4dff-e2ae-4168-aa42-10e2537edc3c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4" Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.141966 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lpj6\" (UniqueName: \"kubernetes.io/projected/37a171e5-c09c-4c2d-ad77-c2c5a6027985-kube-api-access-7lpj6\") pod \"service-ca-operator-d6fc45fc5-p6kl9\" (UID: \"37a171e5-c09c-4c2d-ad77-c2c5a6027985\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9" Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.142635 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmk49\" (UniqueName: \"kubernetes.io/projected/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-kube-api-access-mmk49\") pod 
\"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd"
Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.142701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxnvv\" (UniqueName: \"kubernetes.io/projected/7060df7d-1449-4026-9723-09376f46de81-kube-api-access-qxnvv\") pod \"klusterlet-addon-workmgr-77784c6fb8-jc2kw\" (UID: \"7060df7d-1449-4026-9723-09376f46de81\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw"
Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.142749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.142777 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/010e4dff-e2ae-4168-aa42-10e2537edc3c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-m59z4\" (UID: \"010e4dff-e2ae-4168-aa42-10e2537edc3c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4"
Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.142805 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e91dab5b-ef33-493b-9cc9-99941410ef37-trusted-ca\") pod \"console-operator-9d4b6777b-p9l6p\" (UID: \"e91dab5b-ef33-493b-9cc9-99941410ef37\") " pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p"
Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.142841 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/73af315a-43db-4687-aa2e-2555ab2f3d65-snapshots\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln"
Apr 21 00:03:32.143791 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.142867 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a171e5-c09c-4c2d-ad77-c2c5a6027985-config\") pod \"service-ca-operator-d6fc45fc5-p6kl9\" (UID: \"37a171e5-c09c-4c2d-ad77-c2c5a6027985\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9"
Apr 21 00:03:32.144576 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.143266 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle podName:e4103e8a-c222-4304-953c-43f57e73acef nodeName:}" failed. No retries permitted until 2026-04-21 00:03:32.643247909 +0000 UTC m=+34.256619428 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle") pod "router-default-69675cd558-ml7nv" (UID: "e4103e8a-c222-4304-953c-43f57e73acef") : configmap references non-existent config key: service-ca.crt
Apr 21 00:03:32.144576 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.143695 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/25ad4b12-b3eb-4705-9146-fc282c21c226-hub\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj"
Apr 21 00:03:32.144576 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.143716 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/010e4dff-e2ae-4168-aa42-10e2537edc3c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-m59z4\" (UID: \"010e4dff-e2ae-4168-aa42-10e2537edc3c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4"
Apr 21 00:03:32.144576 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.144259 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/25ad4b12-b3eb-4705-9146-fc282c21c226-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj"
Apr 21 00:03:32.144576 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.144273 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/73af315a-43db-4687-aa2e-2555ab2f3d65-snapshots\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln"
Apr 21 00:03:32.144853 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.144781 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/25ad4b12-b3eb-4705-9146-fc282c21c226-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj"
Apr 21 00:03:32.144853 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.144824 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7060df7d-1449-4026-9723-09376f46de81-klusterlet-config\") pod \"klusterlet-addon-workmgr-77784c6fb8-jc2kw\" (UID: \"7060df7d-1449-4026-9723-09376f46de81\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw"
Apr 21 00:03:32.145183 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.145011 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2d04f98f-58fd-479e-8553-0b49ee6f1c58-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-788cb5b5d9-259j7\" (UID: \"2d04f98f-58fd-479e-8553-0b49ee6f1c58\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788cb5b5d9-259j7"
Apr 21 00:03:32.145183 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.145128 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jtsx5"
Apr 21 00:03:32.145455 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.145432 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/25ad4b12-b3eb-4705-9146-fc282c21c226-ca\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj"
Apr 21 00:03:32.145542 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.145522 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73af315a-43db-4687-aa2e-2555ab2f3d65-serving-cert\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln"
Apr 21 00:03:32.145984 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.145750 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-default-certificate\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:03:32.147386 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.147342 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-stats-auth\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:03:32.149326 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.149302 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9wst\" (UniqueName: \"kubernetes.io/projected/010e4dff-e2ae-4168-aa42-10e2537edc3c-kube-api-access-w9wst\") pod \"kube-storage-version-migrator-operator-6769c5d45-m59z4\" (UID: \"010e4dff-e2ae-4168-aa42-10e2537edc3c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4"
Apr 21 00:03:32.150052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.149945 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf689\" (UniqueName: \"kubernetes.io/projected/2d04f98f-58fd-479e-8553-0b49ee6f1c58-kube-api-access-sf689\") pod \"managed-serviceaccount-addon-agent-788cb5b5d9-259j7\" (UID: \"2d04f98f-58fd-479e-8553-0b49ee6f1c58\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788cb5b5d9-259j7"
Apr 21 00:03:32.150831 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.150787 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hhdm\" (UniqueName: \"kubernetes.io/projected/73af315a-43db-4687-aa2e-2555ab2f3d65-kube-api-access-8hhdm\") pod \"insights-operator-585dfdc468-b4dln\" (UID: \"73af315a-43db-4687-aa2e-2555ab2f3d65\") " pod="openshift-insights/insights-operator-585dfdc468-b4dln"
Apr 21 00:03:32.150919 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.150895 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g4ct\" (UniqueName: \"kubernetes.io/projected/25ad4b12-b3eb-4705-9146-fc282c21c226-kube-api-access-8g4ct\") pod \"cluster-proxy-proxy-agent-6c6756898-dclgj\" (UID: \"25ad4b12-b3eb-4705-9146-fc282c21c226\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj"
Apr 21 00:03:32.150979 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.150916 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vvrc\" (UniqueName: \"kubernetes.io/projected/e4103e8a-c222-4304-953c-43f57e73acef-kube-api-access-7vvrc\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:03:32.152030 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.152011 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxnvv\" (UniqueName: \"kubernetes.io/projected/7060df7d-1449-4026-9723-09376f46de81-kube-api-access-qxnvv\") pod \"klusterlet-addon-workmgr-77784c6fb8-jc2kw\" (UID: \"7060df7d-1449-4026-9723-09376f46de81\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw"
Apr 21 00:03:32.183978 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.183934 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4"
Apr 21 00:03:32.196815 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.196781 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw"
Apr 21 00:03:32.218627 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.218600 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-b4dln"
Apr 21 00:03:32.243528 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.243496 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnsz9\" (UniqueName: \"kubernetes.io/projected/e28815ec-1f97-4757-b463-8aec1ad6b01e-kube-api-access-vnsz9\") pod \"ingress-canary-6w5rw\" (UID: \"e28815ec-1f97-4757-b463-8aec1ad6b01e\") " pod="openshift-ingress-canary/ingress-canary-6w5rw"
Apr 21 00:03:32.243528 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.243530 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd"
Apr 21 00:03:32.243676 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.243575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgn64\" (UniqueName: \"kubernetes.io/projected/e91dab5b-ef33-493b-9cc9-99941410ef37-kube-api-access-tgn64\") pod \"console-operator-9d4b6777b-p9l6p\" (UID: \"e91dab5b-ef33-493b-9cc9-99941410ef37\") " pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p"
Apr 21 00:03:32.243676 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.243598 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-config-volume\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd"
Apr 21 00:03:32.243676 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.243614 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e91dab5b-ef33-493b-9cc9-99941410ef37-config\") pod \"console-operator-9d4b6777b-p9l6p\" (UID: \"e91dab5b-ef33-493b-9cc9-99941410ef37\") " pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p"
Apr 21 00:03:32.243676 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.243639 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert\") pod \"ingress-canary-6w5rw\" (UID: \"e28815ec-1f97-4757-b463-8aec1ad6b01e\") " pod="openshift-ingress-canary/ingress-canary-6w5rw"
Apr 21 00:03:32.243840 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.243700 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a171e5-c09c-4c2d-ad77-c2c5a6027985-serving-cert\") pod \"service-ca-operator-d6fc45fc5-p6kl9\" (UID: \"37a171e5-c09c-4c2d-ad77-c2c5a6027985\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9"
Apr 21 00:03:32.243840 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.243787 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-tmp-dir\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd"
Apr 21 00:03:32.243840 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.243805 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 00:03:32.243840 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.243818 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lpj6\" (UniqueName: \"kubernetes.io/projected/37a171e5-c09c-4c2d-ad77-c2c5a6027985-kube-api-access-7lpj6\") pod \"service-ca-operator-d6fc45fc5-p6kl9\" (UID: \"37a171e5-c09c-4c2d-ad77-c2c5a6027985\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9"
Apr 21 00:03:32.244023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.243846 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmk49\" (UniqueName: \"kubernetes.io/projected/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-kube-api-access-mmk49\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd"
Apr 21 00:03:32.244023 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.243877 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls podName:5dca0c9f-ca96-4377-bb4d-280b9c469ca1 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:32.743857224 +0000 UTC m=+34.357228735 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls") pod "dns-default-nxkjd" (UID: "5dca0c9f-ca96-4377-bb4d-280b9c469ca1") : secret "dns-default-metrics-tls" not found
Apr 21 00:03:32.244023 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.243950 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 00:03:32.244023 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.244009 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert podName:e28815ec-1f97-4757-b463-8aec1ad6b01e nodeName:}" failed. No retries permitted until 2026-04-21 00:03:32.743991142 +0000 UTC m=+34.357362659 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert") pod "ingress-canary-6w5rw" (UID: "e28815ec-1f97-4757-b463-8aec1ad6b01e") : secret "canary-serving-cert" not found
Apr 21 00:03:32.244352 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.244293 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-tmp-dir\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd"
Apr 21 00:03:32.244352 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.244335 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-config-volume\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd"
Apr 21 00:03:32.244451 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.244361 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e91dab5b-ef33-493b-9cc9-99941410ef37-trusted-ca\") pod \"console-operator-9d4b6777b-p9l6p\" (UID: \"e91dab5b-ef33-493b-9cc9-99941410ef37\") " pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p"
Apr 21 00:03:32.244451 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.244390 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a171e5-c09c-4c2d-ad77-c2c5a6027985-config\") pod \"service-ca-operator-d6fc45fc5-p6kl9\" (UID: \"37a171e5-c09c-4c2d-ad77-c2c5a6027985\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9"
Apr 21 00:03:32.244451 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.244427 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e91dab5b-ef33-493b-9cc9-99941410ef37-serving-cert\") pod \"console-operator-9d4b6777b-p9l6p\" (UID: \"e91dab5b-ef33-493b-9cc9-99941410ef37\") " pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p"
Apr 21 00:03:32.244685 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.244662 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e91dab5b-ef33-493b-9cc9-99941410ef37-config\") pod \"console-operator-9d4b6777b-p9l6p\" (UID: \"e91dab5b-ef33-493b-9cc9-99941410ef37\") " pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p"
Apr 21 00:03:32.245032 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.245008 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a171e5-c09c-4c2d-ad77-c2c5a6027985-config\") pod \"service-ca-operator-d6fc45fc5-p6kl9\" (UID: \"37a171e5-c09c-4c2d-ad77-c2c5a6027985\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9"
Apr 21 00:03:32.245164 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.245142 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e91dab5b-ef33-493b-9cc9-99941410ef37-trusted-ca\") pod \"console-operator-9d4b6777b-p9l6p\" (UID: \"e91dab5b-ef33-493b-9cc9-99941410ef37\") " pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p"
Apr 21 00:03:32.246838 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.246818 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e91dab5b-ef33-493b-9cc9-99941410ef37-serving-cert\") pod \"console-operator-9d4b6777b-p9l6p\" (UID: \"e91dab5b-ef33-493b-9cc9-99941410ef37\") " pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p"
Apr 21 00:03:32.247064 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.247041 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a171e5-c09c-4c2d-ad77-c2c5a6027985-serving-cert\") pod \"service-ca-operator-d6fc45fc5-p6kl9\" (UID: \"37a171e5-c09c-4c2d-ad77-c2c5a6027985\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9"
Apr 21 00:03:32.248719 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.248701 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788cb5b5d9-259j7"
Apr 21 00:03:32.252681 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.252660 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lpj6\" (UniqueName: \"kubernetes.io/projected/37a171e5-c09c-4c2d-ad77-c2c5a6027985-kube-api-access-7lpj6\") pod \"service-ca-operator-d6fc45fc5-p6kl9\" (UID: \"37a171e5-c09c-4c2d-ad77-c2c5a6027985\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9"
Apr 21 00:03:32.253229 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.253206 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmk49\" (UniqueName: \"kubernetes.io/projected/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-kube-api-access-mmk49\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd"
Apr 21 00:03:32.253229 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.253223 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgn64\" (UniqueName: \"kubernetes.io/projected/e91dab5b-ef33-493b-9cc9-99941410ef37-kube-api-access-tgn64\") pod \"console-operator-9d4b6777b-p9l6p\" (UID: \"e91dab5b-ef33-493b-9cc9-99941410ef37\") " pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p"
Apr 21 00:03:32.253461 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.253439 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnsz9\" (UniqueName: \"kubernetes.io/projected/e28815ec-1f97-4757-b463-8aec1ad6b01e-kube-api-access-vnsz9\") pod \"ingress-canary-6w5rw\" (UID: \"e28815ec-1f97-4757-b463-8aec1ad6b01e\") " pod="openshift-ingress-canary/ingress-canary-6w5rw"
Apr 21 00:03:32.264764 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.264743 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj"
Apr 21 00:03:32.296018 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.295998 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p"
Apr 21 00:03:32.317967 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.317940 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9"
Apr 21 00:03:32.446919 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.446831 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw"
Apr 21 00:03:32.447121 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.446981 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 00:03:32.447121 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.447004 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56998694b4-5kqfw: secret "image-registry-tls" not found
Apr 21 00:03:32.447121 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.447072 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls podName:b4420432-cd86-4b5b-a350-f40c3c3cb85b nodeName:}" failed. No retries permitted until 2026-04-21 00:03:33.44705323 +0000 UTC m=+35.060424729 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls") pod "image-registry-56998694b4-5kqfw" (UID: "b4420432-cd86-4b5b-a350-f40c3c3cb85b") : secret "image-registry-tls" not found
Apr 21 00:03:32.548238 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.548192 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2l2dr\" (UID: \"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr"
Apr 21 00:03:32.548438 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.548292 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m2p9z\" (UID: \"8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z"
Apr 21 00:03:32.548438 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.548324 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: \"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq"
Apr 21 00:03:32.548438 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.548358 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 00:03:32.548438 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.548436 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls podName:f81ce5f8-4def-44ee-ae07-4b37a1fbe3be nodeName:}" failed. No retries permitted until 2026-04-21 00:03:33.548416414 +0000 UTC m=+35.161787935 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2l2dr" (UID: "f81ce5f8-4def-44ee-ae07-4b37a1fbe3be") : secret "samples-operator-tls" not found
Apr 21 00:03:32.548640 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.548455 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 00:03:32.548640 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.548488 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 00:03:32.548640 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.548535 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert podName:8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e nodeName:}" failed. No retries permitted until 2026-04-21 00:03:33.548515557 +0000 UTC m=+35.161887075 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-m2p9z" (UID: "8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e") : secret "networking-console-plugin-cert" not found
Apr 21 00:03:32.548640 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.548554 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls podName:4854cd30-d0eb-4603-8e32-c7919b625f6c nodeName:}" failed. No retries permitted until 2026-04-21 00:03:33.54854519 +0000 UTC m=+35.161916695 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cn6dq" (UID: "4854cd30-d0eb-4603-8e32-c7919b625f6c") : secret "cluster-monitoring-operator-tls" not found
Apr 21 00:03:32.649022 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.648985 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc5d7\" (UniqueName: \"kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7\") pod \"network-check-target-s2mw9\" (UID: \"a10f7678-f6da-46bb-86eb-c0de2afb421c\") " pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:32.649195 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.649134 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:03:32.649258 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.649214 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs\") pod \"network-metrics-daemon-6fz9j\" (UID: \"173d74c8-1f07-4764-a03f-8091e02dc212\") " pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:03:32.649258 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.649243 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:03:32.649362 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.649346 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 00:03:32.649439 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.649414 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 00:03:32.649549 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.649420 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle podName:e4103e8a-c222-4304-953c-43f57e73acef nodeName:}" failed. No retries permitted until 2026-04-21 00:03:33.649402155 +0000 UTC m=+35.262773671 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle") pod "router-default-69675cd558-ml7nv" (UID: "e4103e8a-c222-4304-953c-43f57e73acef") : configmap references non-existent config key: service-ca.crt
Apr 21 00:03:32.649549 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.649476 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs podName:e4103e8a-c222-4304-953c-43f57e73acef nodeName:}" failed. No retries permitted until 2026-04-21 00:03:33.649460923 +0000 UTC m=+35.262832436 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs") pod "router-default-69675cd558-ml7nv" (UID: "e4103e8a-c222-4304-953c-43f57e73acef") : secret "router-metrics-certs-default" not found
Apr 21 00:03:32.649549 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.649491 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs podName:173d74c8-1f07-4764-a03f-8091e02dc212 nodeName:}" failed. No retries permitted until 2026-04-21 00:04:04.649482365 +0000 UTC m=+66.262853866 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs") pod "network-metrics-daemon-6fz9j" (UID: "173d74c8-1f07-4764-a03f-8091e02dc212") : secret "metrics-daemon-secret" not found
Apr 21 00:03:32.651968 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.651942 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc5d7\" (UniqueName: \"kubernetes.io/projected/a10f7678-f6da-46bb-86eb-c0de2afb421c-kube-api-access-tc5d7\") pod \"network-check-target-s2mw9\" (UID: \"a10f7678-f6da-46bb-86eb-c0de2afb421c\") " pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:32.750745 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.750437 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd"
Apr 21 00:03:32.750745 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.750713 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert\") pod \"ingress-canary-6w5rw\" (UID: \"e28815ec-1f97-4757-b463-8aec1ad6b01e\") " pod="openshift-ingress-canary/ingress-canary-6w5rw"
Apr 21 00:03:32.750745 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.750600 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 00:03:32.751228 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.750828 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls podName:5dca0c9f-ca96-4377-bb4d-280b9c469ca1 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:33.75080476 +0000 UTC m=+35.364176278 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls") pod "dns-default-nxkjd" (UID: "5dca0c9f-ca96-4377-bb4d-280b9c469ca1") : secret "dns-default-metrics-tls" not found
Apr 21 00:03:32.751228 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.750892 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 00:03:32.751228 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:32.750951 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert podName:e28815ec-1f97-4757-b463-8aec1ad6b01e nodeName:}" failed. No retries permitted until 2026-04-21 00:03:33.750935908 +0000 UTC m=+35.364307416 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert") pod "ingress-canary-6w5rw" (UID: "e28815ec-1f97-4757-b463-8aec1ad6b01e") : secret "canary-serving-cert" not found
Apr 21 00:03:32.902612 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:32.902572 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2mw9" Apr 21 00:03:33.459482 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.458694 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:33.459482 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.458958 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 00:03:33.459482 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.458975 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56998694b4-5kqfw: secret "image-registry-tls" not found Apr 21 00:03:33.459482 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.459038 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls podName:b4420432-cd86-4b5b-a350-f40c3c3cb85b nodeName:}" failed. No retries permitted until 2026-04-21 00:03:35.459018752 +0000 UTC m=+37.072390254 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls") pod "image-registry-56998694b4-5kqfw" (UID: "b4420432-cd86-4b5b-a350-f40c3c3cb85b") : secret "image-registry-tls" not found Apr 21 00:03:33.535988 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.535931 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-p9l6p"] Apr 21 00:03:33.538700 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.538673 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-b4dln"] Apr 21 00:03:33.542044 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.540004 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jtsx5"] Apr 21 00:03:33.542044 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.541988 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj"] Apr 21 00:03:33.545319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.545296 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9"] Apr 21 00:03:33.555571 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.555549 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw"] Apr 21 00:03:33.558124 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.558063 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-s2mw9"] Apr 21 00:03:33.560033 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.560013 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2l2dr\" (UID: \"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr" Apr 21 00:03:33.560123 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.560071 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m2p9z\" (UID: \"8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z" Apr 21 00:03:33.560163 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.560121 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: \"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" Apr 21 00:03:33.560203 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.560181 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 00:03:33.560249 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.560230 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 00:03:33.560249 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.560247 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert podName:8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e nodeName:}" failed. 
No retries permitted until 2026-04-21 00:03:35.560231911 +0000 UTC m=+37.173603435 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-m2p9z" (UID: "8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e") : secret "networking-console-plugin-cert" not found Apr 21 00:03:33.560347 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.560268 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 00:03:33.560687 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.560675 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls podName:4854cd30-d0eb-4603-8e32-c7919b625f6c nodeName:}" failed. No retries permitted until 2026-04-21 00:03:35.560267024 +0000 UTC m=+37.173638530 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cn6dq" (UID: "4854cd30-d0eb-4603-8e32-c7919b625f6c") : secret "cluster-monitoring-operator-tls" not found Apr 21 00:03:33.560724 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.560701 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls podName:f81ce5f8-4def-44ee-ae07-4b37a1fbe3be nodeName:}" failed. No retries permitted until 2026-04-21 00:03:35.56069224 +0000 UTC m=+37.174063739 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2l2dr" (UID: "f81ce5f8-4def-44ee-ae07-4b37a1fbe3be") : secret "samples-operator-tls" not found Apr 21 00:03:33.572360 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.569921 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4"] Apr 21 00:03:33.573250 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.573188 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788cb5b5d9-259j7"] Apr 21 00:03:33.574413 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.574369 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d4kjt"] Apr 21 00:03:33.575125 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.575106 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29612160-ghs4n"] Apr 21 00:03:33.658770 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:33.658734 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73af315a_43db_4687_aa2e_2555ab2f3d65.slice/crio-e779235132dbf6169a53f1a65dc1c5c3cb9a9c3f0906271cafd061c3f4832b64 WatchSource:0}: Error finding container e779235132dbf6169a53f1a65dc1c5c3cb9a9c3f0906271cafd061c3f4832b64: Status 404 returned error can't find the container with id e779235132dbf6169a53f1a65dc1c5c3cb9a9c3f0906271cafd061c3f4832b64 Apr 21 00:03:33.659570 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:33.659462 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb84b042_37b8_43f5_94e4_43fb54d2041b.slice/crio-41c078f13e25a5a59bd89282abd0c224ec75f306b795ba76e9c5c9d70c4587f2 WatchSource:0}: Error finding container 41c078f13e25a5a59bd89282abd0c224ec75f306b795ba76e9c5c9d70c4587f2: Status 404 returned error can't find the container with id 41c078f13e25a5a59bd89282abd0c224ec75f306b795ba76e9c5c9d70c4587f2 Apr 21 00:03:33.660265 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:33.660241 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode91dab5b_ef33_493b_9cc9_99941410ef37.slice/crio-8976e37ecda42f78396ff994a878b61dd30690adec31caea359c4dd358eb7944 WatchSource:0}: Error finding container 8976e37ecda42f78396ff994a878b61dd30690adec31caea359c4dd358eb7944: Status 404 returned error can't find the container with id 8976e37ecda42f78396ff994a878b61dd30690adec31caea359c4dd358eb7944 Apr 21 00:03:33.660742 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.660654 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:03:33.660867 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.660782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:03:33.660867 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.660814 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret 
"router-metrics-certs-default" not found Apr 21 00:03:33.660972 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.660878 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs podName:e4103e8a-c222-4304-953c-43f57e73acef nodeName:}" failed. No retries permitted until 2026-04-21 00:03:35.660857042 +0000 UTC m=+37.274228557 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs") pod "router-default-69675cd558-ml7nv" (UID: "e4103e8a-c222-4304-953c-43f57e73acef") : secret "router-metrics-certs-default" not found Apr 21 00:03:33.660972 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.660955 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle podName:e4103e8a-c222-4304-953c-43f57e73acef nodeName:}" failed. No retries permitted until 2026-04-21 00:03:35.660936121 +0000 UTC m=+37.274307621 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle") pod "router-default-69675cd558-ml7nv" (UID: "e4103e8a-c222-4304-953c-43f57e73acef") : configmap references non-existent config key: service-ca.crt Apr 21 00:03:33.661374 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:33.661354 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25ad4b12_b3eb_4705_9146_fc282c21c226.slice/crio-1c43476e5cc2842e6f3f56d42b1adba9c19620b145c4229892692b1d40e99d7a WatchSource:0}: Error finding container 1c43476e5cc2842e6f3f56d42b1adba9c19620b145c4229892692b1d40e99d7a: Status 404 returned error can't find the container with id 1c43476e5cc2842e6f3f56d42b1adba9c19620b145c4229892692b1d40e99d7a Apr 21 00:03:33.667637 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:33.667608 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda10f7678_f6da_46bb_86eb_c0de2afb421c.slice/crio-b534a59e23b608a208353d4e6f4bc70b002487d678ffa2ac1d082f6c49cab4f6 WatchSource:0}: Error finding container b534a59e23b608a208353d4e6f4bc70b002487d678ffa2ac1d082f6c49cab4f6: Status 404 returned error can't find the container with id b534a59e23b608a208353d4e6f4bc70b002487d678ffa2ac1d082f6c49cab4f6 Apr 21 00:03:33.667763 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:33.667741 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2f4bd8a_7dae_45c0_9b1e_a5c145a09876.slice/crio-a3a5c7427d78d519c00cdac9ba35d9e6e33dcfba7e12805a2d4f10bb7022f0ee WatchSource:0}: Error finding container a3a5c7427d78d519c00cdac9ba35d9e6e33dcfba7e12805a2d4f10bb7022f0ee: Status 404 returned error can't find the container with id a3a5c7427d78d519c00cdac9ba35d9e6e33dcfba7e12805a2d4f10bb7022f0ee Apr 21 
00:03:33.669806 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:33.669785 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod010e4dff_e2ae_4168_aa42_10e2537edc3c.slice/crio-7bc4333a45d6398eb813c7625f1219efef736265b4415efc9944348e5d609331 WatchSource:0}: Error finding container 7bc4333a45d6398eb813c7625f1219efef736265b4415efc9944348e5d609331: Status 404 returned error can't find the container with id 7bc4333a45d6398eb813c7625f1219efef736265b4415efc9944348e5d609331 Apr 21 00:03:33.671131 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:33.670876 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d04f98f_58fd_479e_8553_0b49ee6f1c58.slice/crio-7f19b208d9aabae79705b58b4a2df88fc90694d8a02c095b603f72c0d5067fcf WatchSource:0}: Error finding container 7f19b208d9aabae79705b58b4a2df88fc90694d8a02c095b603f72c0d5067fcf: Status 404 returned error can't find the container with id 7f19b208d9aabae79705b58b4a2df88fc90694d8a02c095b603f72c0d5067fcf Apr 21 00:03:33.671987 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:33.671965 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c729e77_783c_4ed2_831a_538689f33279.slice/crio-df8d619e32d0c4e99f005393b4afbbaa86cbdbed422cbeaac1573afc5171efed WatchSource:0}: Error finding container df8d619e32d0c4e99f005393b4afbbaa86cbdbed422cbeaac1573afc5171efed: Status 404 returned error can't find the container with id df8d619e32d0c4e99f005393b4afbbaa86cbdbed422cbeaac1573afc5171efed Apr 21 00:03:33.761411 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.761389 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert\") pod \"ingress-canary-6w5rw\" (UID: \"e28815ec-1f97-4757-b463-8aec1ad6b01e\") 
" pod="openshift-ingress-canary/ingress-canary-6w5rw" Apr 21 00:03:33.761691 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.761535 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 00:03:33.761691 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:33.761556 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd" Apr 21 00:03:33.761691 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.761585 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert podName:e28815ec-1f97-4757-b463-8aec1ad6b01e nodeName:}" failed. No retries permitted until 2026-04-21 00:03:35.761568067 +0000 UTC m=+37.374939565 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert") pod "ingress-canary-6w5rw" (UID: "e28815ec-1f97-4757-b463-8aec1ad6b01e") : secret "canary-serving-cert" not found Apr 21 00:03:33.761691 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.761629 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 00:03:33.761691 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:33.761670 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls podName:5dca0c9f-ca96-4377-bb4d-280b9c469ca1 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:35.761656625 +0000 UTC m=+37.375028136 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls") pod "dns-default-nxkjd" (UID: "5dca0c9f-ca96-4377-bb4d-280b9c469ca1") : secret "dns-default-metrics-tls" not found Apr 21 00:03:34.144549 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.144381 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29612160-ghs4n" event={"ID":"8c729e77-783c-4ed2-831a-538689f33279","Type":"ContainerStarted","Data":"45de2ef2562e717b7d0d7d8888c6c1d46c2c5fc7780fcdfc75c4bb0d07da8f78"} Apr 21 00:03:34.144712 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.144557 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29612160-ghs4n" event={"ID":"8c729e77-783c-4ed2-831a-538689f33279","Type":"ContainerStarted","Data":"df8d619e32d0c4e99f005393b4afbbaa86cbdbed422cbeaac1573afc5171efed"} Apr 21 00:03:34.145457 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.145432 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-s2mw9" event={"ID":"a10f7678-f6da-46bb-86eb-c0de2afb421c","Type":"ContainerStarted","Data":"b534a59e23b608a208353d4e6f4bc70b002487d678ffa2ac1d082f6c49cab4f6"} Apr 21 00:03:34.146382 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.146354 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" event={"ID":"25ad4b12-b3eb-4705-9146-fc282c21c226","Type":"ContainerStarted","Data":"1c43476e5cc2842e6f3f56d42b1adba9c19620b145c4229892692b1d40e99d7a"} Apr 21 00:03:34.147264 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.147241 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9" 
event={"ID":"37a171e5-c09c-4c2d-ad77-c2c5a6027985","Type":"ContainerStarted","Data":"4e0768150b562217238c6a7e77c3e80171f03d2705907395502a69c2647b6548"} Apr 21 00:03:34.148131 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.148112 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jtsx5" event={"ID":"fb84b042-37b8-43f5-94e4-43fb54d2041b","Type":"ContainerStarted","Data":"41c078f13e25a5a59bd89282abd0c224ec75f306b795ba76e9c5c9d70c4587f2"} Apr 21 00:03:34.149170 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.149152 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d4kjt" event={"ID":"a2f4bd8a-7dae-45c0-9b1e-a5c145a09876","Type":"ContainerStarted","Data":"a3a5c7427d78d519c00cdac9ba35d9e6e33dcfba7e12805a2d4f10bb7022f0ee"} Apr 21 00:03:34.150048 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.150030 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p" event={"ID":"e91dab5b-ef33-493b-9cc9-99941410ef37","Type":"ContainerStarted","Data":"8976e37ecda42f78396ff994a878b61dd30690adec31caea359c4dd358eb7944"} Apr 21 00:03:34.152124 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.152079 2571 generic.go:358] "Generic (PLEG): container finished" podID="f398c142-f284-48f9-b608-6eb7229425ae" containerID="c0ccdeb0f87380fad82288f91456002ae1762642c5ca4bbb114086ccf4a989d9" exitCode=0 Apr 21 00:03:34.152204 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.152153 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mghn5" event={"ID":"f398c142-f284-48f9-b608-6eb7229425ae","Type":"ContainerDied","Data":"c0ccdeb0f87380fad82288f91456002ae1762642c5ca4bbb114086ccf4a989d9"} Apr 21 00:03:34.153042 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.153025 2571 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4" event={"ID":"010e4dff-e2ae-4168-aa42-10e2537edc3c","Type":"ContainerStarted","Data":"7bc4333a45d6398eb813c7625f1219efef736265b4415efc9944348e5d609331"} Apr 21 00:03:34.153956 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.153937 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788cb5b5d9-259j7" event={"ID":"2d04f98f-58fd-479e-8553-0b49ee6f1c58","Type":"ContainerStarted","Data":"7f19b208d9aabae79705b58b4a2df88fc90694d8a02c095b603f72c0d5067fcf"} Apr 21 00:03:34.154915 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.154890 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw" event={"ID":"7060df7d-1449-4026-9723-09376f46de81","Type":"ContainerStarted","Data":"150931bb056ec0a1f66448a10bca4315bc6af8684a7ffd458c660e1989ffb2ea"} Apr 21 00:03:34.155802 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.155785 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b4dln" event={"ID":"73af315a-43db-4687-aa2e-2555ab2f3d65","Type":"ContainerStarted","Data":"e779235132dbf6169a53f1a65dc1c5c3cb9a9c3f0906271cafd061c3f4832b64"} Apr 21 00:03:34.160076 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:34.160043 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29612160-ghs4n" podStartSLOduration=214.160028793 podStartE2EDuration="3m34.160028793s" podCreationTimestamp="2026-04-21 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 00:03:34.159295483 +0000 UTC m=+35.772667006" watchObservedRunningTime="2026-04-21 00:03:34.160028793 +0000 UTC m=+35.773400314" Apr 21 00:03:35.178819 
ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:35.178770 2571 generic.go:358] "Generic (PLEG): container finished" podID="f398c142-f284-48f9-b608-6eb7229425ae" containerID="b97be1eb1ecba41c2f8a401660cc432e6825a0da944283d81abd6813b22e097e" exitCode=0 Apr 21 00:03:35.180606 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:35.178920 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mghn5" event={"ID":"f398c142-f284-48f9-b608-6eb7229425ae","Type":"ContainerDied","Data":"b97be1eb1ecba41c2f8a401660cc432e6825a0da944283d81abd6813b22e097e"} Apr 21 00:03:35.481307 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:35.481228 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:03:35.481767 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.481560 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 00:03:35.481767 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.481582 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56998694b4-5kqfw: secret "image-registry-tls" not found Apr 21 00:03:35.481767 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.481694 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls podName:b4420432-cd86-4b5b-a350-f40c3c3cb85b nodeName:}" failed. No retries permitted until 2026-04-21 00:03:39.481675164 +0000 UTC m=+41.095046679 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls") pod "image-registry-56998694b4-5kqfw" (UID: "b4420432-cd86-4b5b-a350-f40c3c3cb85b") : secret "image-registry-tls" not found
Apr 21 00:03:35.588436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:35.582352 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2l2dr\" (UID: \"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr"
Apr 21 00:03:35.588436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:35.588119 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m2p9z\" (UID: \"8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z"
Apr 21 00:03:35.588436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:35.588204 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: \"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq"
Apr 21 00:03:35.588436 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.588308 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 00:03:35.588436 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.588337 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 00:03:35.588436 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.588401 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert podName:8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e nodeName:}" failed. No retries permitted until 2026-04-21 00:03:39.588378895 +0000 UTC m=+41.201750414 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-m2p9z" (UID: "8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e") : secret "networking-console-plugin-cert" not found
Apr 21 00:03:35.588436 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.588417 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 00:03:35.588436 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.588421 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls podName:4854cd30-d0eb-4603-8e32-c7919b625f6c nodeName:}" failed. No retries permitted until 2026-04-21 00:03:39.588410367 +0000 UTC m=+41.201781866 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cn6dq" (UID: "4854cd30-d0eb-4603-8e32-c7919b625f6c") : secret "cluster-monitoring-operator-tls" not found
Apr 21 00:03:35.588953 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.588456 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls podName:f81ce5f8-4def-44ee-ae07-4b37a1fbe3be nodeName:}" failed. No retries permitted until 2026-04-21 00:03:39.588445354 +0000 UTC m=+41.201816868 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2l2dr" (UID: "f81ce5f8-4def-44ee-ae07-4b37a1fbe3be") : secret "samples-operator-tls" not found
Apr 21 00:03:35.690637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:35.689684 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:03:35.690637 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:35.689785 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:03:35.690637 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.689949 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle podName:e4103e8a-c222-4304-953c-43f57e73acef nodeName:}" failed. No retries permitted until 2026-04-21 00:03:39.68992964 +0000 UTC m=+41.303301145 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle") pod "router-default-69675cd558-ml7nv" (UID: "e4103e8a-c222-4304-953c-43f57e73acef") : configmap references non-existent config key: service-ca.crt
Apr 21 00:03:35.690637 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.690406 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 00:03:35.690637 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.690459 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs podName:e4103e8a-c222-4304-953c-43f57e73acef nodeName:}" failed. No retries permitted until 2026-04-21 00:03:39.690442616 +0000 UTC m=+41.303814134 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs") pod "router-default-69675cd558-ml7nv" (UID: "e4103e8a-c222-4304-953c-43f57e73acef") : secret "router-metrics-certs-default" not found
Apr 21 00:03:35.791014 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:35.790972 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert\") pod \"ingress-canary-6w5rw\" (UID: \"e28815ec-1f97-4757-b463-8aec1ad6b01e\") " pod="openshift-ingress-canary/ingress-canary-6w5rw"
Apr 21 00:03:35.791231 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:35.791212 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd"
Apr 21 00:03:35.791369 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.791352 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 00:03:35.791443 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.791419 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls podName:5dca0c9f-ca96-4377-bb4d-280b9c469ca1 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:39.791400741 +0000 UTC m=+41.404772245 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls") pod "dns-default-nxkjd" (UID: "5dca0c9f-ca96-4377-bb4d-280b9c469ca1") : secret "dns-default-metrics-tls" not found
Apr 21 00:03:35.791861 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.791840 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 00:03:35.791944 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:35.791896 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert podName:e28815ec-1f97-4757-b463-8aec1ad6b01e nodeName:}" failed. No retries permitted until 2026-04-21 00:03:39.791880278 +0000 UTC m=+41.405251781 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert") pod "ingress-canary-6w5rw" (UID: "e28815ec-1f97-4757-b463-8aec1ad6b01e") : secret "canary-serving-cert" not found
Apr 21 00:03:36.221496 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:36.221395 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mghn5" event={"ID":"f398c142-f284-48f9-b608-6eb7229425ae","Type":"ContainerStarted","Data":"3cc286e4cdc3868f16f66a46944bc87b88533faa7adb584c7f923b46552af607"}
Apr 21 00:03:36.245712 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:36.245656 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mghn5" podStartSLOduration=5.379389274 podStartE2EDuration="37.245635582s" podCreationTimestamp="2026-04-21 00:02:59 +0000 UTC" firstStartedPulling="2026-04-21 00:03:01.857723542 +0000 UTC m=+3.471095044" lastFinishedPulling="2026-04-21 00:03:33.72396985 +0000 UTC m=+35.337341352" observedRunningTime="2026-04-21 00:03:36.243291728 +0000 UTC m=+37.856663247" watchObservedRunningTime="2026-04-21 00:03:36.245635582 +0000 UTC m=+37.859007111"
Apr 21 00:03:38.723655 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:38.723618 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:38.728600 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:38.728542 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4c3f54e-8135-4a92-b7dc-1bef279e0201-original-pull-secret\") pod \"global-pull-secret-syncer-8n24l\" (UID: \"d4c3f54e-8135-4a92-b7dc-1bef279e0201\") " pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:38.910762 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:38.910716 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8n24l"
Apr 21 00:03:39.531617 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:39.531570 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw"
Apr 21 00:03:39.531798 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.531729 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 00:03:39.531798 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.531749 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56998694b4-5kqfw: secret "image-registry-tls" not found
Apr 21 00:03:39.531900 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.531812 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls podName:b4420432-cd86-4b5b-a350-f40c3c3cb85b nodeName:}" failed. No retries permitted until 2026-04-21 00:03:47.531797933 +0000 UTC m=+49.145169436 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls") pod "image-registry-56998694b4-5kqfw" (UID: "b4420432-cd86-4b5b-a350-f40c3c3cb85b") : secret "image-registry-tls" not found
Apr 21 00:03:39.632882 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:39.632840 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2l2dr\" (UID: \"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr"
Apr 21 00:03:39.633070 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:39.632940 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m2p9z\" (UID: \"8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z"
Apr 21 00:03:39.633070 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:39.632971 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: \"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq"
Apr 21 00:03:39.633070 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.633014 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 00:03:39.633257 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.633120 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 00:03:39.633257 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.633128 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 00:03:39.633257 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.633124 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls podName:f81ce5f8-4def-44ee-ae07-4b37a1fbe3be nodeName:}" failed. No retries permitted until 2026-04-21 00:03:47.633085177 +0000 UTC m=+49.246456699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2l2dr" (UID: "f81ce5f8-4def-44ee-ae07-4b37a1fbe3be") : secret "samples-operator-tls" not found
Apr 21 00:03:39.633257 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.633188 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls podName:4854cd30-d0eb-4603-8e32-c7919b625f6c nodeName:}" failed. No retries permitted until 2026-04-21 00:03:47.633172501 +0000 UTC m=+49.246544002 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cn6dq" (UID: "4854cd30-d0eb-4603-8e32-c7919b625f6c") : secret "cluster-monitoring-operator-tls" not found
Apr 21 00:03:39.633257 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.633210 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert podName:8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e nodeName:}" failed. No retries permitted until 2026-04-21 00:03:47.633200595 +0000 UTC m=+49.246572097 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-m2p9z" (UID: "8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e") : secret "networking-console-plugin-cert" not found
Apr 21 00:03:39.733980 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:39.733944 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:03:39.734348 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:39.734021 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:03:39.734348 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.734083 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 00:03:39.734348 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.734160 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs podName:e4103e8a-c222-4304-953c-43f57e73acef nodeName:}" failed. No retries permitted until 2026-04-21 00:03:47.734145857 +0000 UTC m=+49.347517356 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs") pod "router-default-69675cd558-ml7nv" (UID: "e4103e8a-c222-4304-953c-43f57e73acef") : secret "router-metrics-certs-default" not found
Apr 21 00:03:39.734348 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.734173 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle podName:e4103e8a-c222-4304-953c-43f57e73acef nodeName:}" failed. No retries permitted until 2026-04-21 00:03:47.734167341 +0000 UTC m=+49.347538839 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle") pod "router-default-69675cd558-ml7nv" (UID: "e4103e8a-c222-4304-953c-43f57e73acef") : configmap references non-existent config key: service-ca.crt
Apr 21 00:03:39.834498 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:39.834421 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd"
Apr 21 00:03:39.834498 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:39.834481 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert\") pod \"ingress-canary-6w5rw\" (UID: \"e28815ec-1f97-4757-b463-8aec1ad6b01e\") " pod="openshift-ingress-canary/ingress-canary-6w5rw"
Apr 21 00:03:39.834718 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.834595 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 00:03:39.834718 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.834652 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 00:03:39.834718 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.834670 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls podName:5dca0c9f-ca96-4377-bb4d-280b9c469ca1 nodeName:}" failed. No retries permitted until 2026-04-21 00:03:47.834649427 +0000 UTC m=+49.448020969 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls") pod "dns-default-nxkjd" (UID: "5dca0c9f-ca96-4377-bb4d-280b9c469ca1") : secret "dns-default-metrics-tls" not found
Apr 21 00:03:39.834718 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:39.834711 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert podName:e28815ec-1f97-4757-b463-8aec1ad6b01e nodeName:}" failed. No retries permitted until 2026-04-21 00:03:47.834694017 +0000 UTC m=+49.448065529 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert") pod "ingress-canary-6w5rw" (UID: "e28815ec-1f97-4757-b463-8aec1ad6b01e") : secret "canary-serving-cert" not found
Apr 21 00:03:46.112367 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:46.112337 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8n24l"]
Apr 21 00:03:46.207605 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:03:46.207537 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4c3f54e_8135_4a92_b7dc_1bef279e0201.slice/crio-cbf83aa7f2ce6298dc69cbdabf5d8d4719e2479028ab0851e1bfafd9173458a2 WatchSource:0}: Error finding container cbf83aa7f2ce6298dc69cbdabf5d8d4719e2479028ab0851e1bfafd9173458a2: Status 404 returned error can't find the container with id cbf83aa7f2ce6298dc69cbdabf5d8d4719e2479028ab0851e1bfafd9173458a2
Apr 21 00:03:46.247619 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:46.247537 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8n24l" event={"ID":"d4c3f54e-8135-4a92-b7dc-1bef279e0201","Type":"ContainerStarted","Data":"cbf83aa7f2ce6298dc69cbdabf5d8d4719e2479028ab0851e1bfafd9173458a2"}
Apr 21 00:03:47.252627 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.252566 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-s2mw9" event={"ID":"a10f7678-f6da-46bb-86eb-c0de2afb421c","Type":"ContainerStarted","Data":"aeb5c72b5fb52be7212a86d56941c43dc69da5c8d36bf0cea7279fc2d2f2c4ee"}
Apr 21 00:03:47.253180 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.253146 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:03:47.255192 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.255159 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" event={"ID":"25ad4b12-b3eb-4705-9146-fc282c21c226","Type":"ContainerStarted","Data":"6c950220052e6b96c2c64b5c8953bbfb74a79897e8d16d2d230a5d45dbbdce72"}
Apr 21 00:03:47.257116 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.257009 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9" event={"ID":"37a171e5-c09c-4c2d-ad77-c2c5a6027985","Type":"ContainerStarted","Data":"1f4014c7666072817502106d9e314d3870426b9ba2ae6e2e2c4b02138ca55e6c"}
Apr 21 00:03:47.259036 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.258746 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jtsx5" event={"ID":"fb84b042-37b8-43f5-94e4-43fb54d2041b","Type":"ContainerStarted","Data":"b6f6c33b3cd7886cc91614fec496d9ed215ab59b2f786d4f9c84896341f9a2a4"}
Apr 21 00:03:47.260952 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.260558 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d4kjt" event={"ID":"a2f4bd8a-7dae-45c0-9b1e-a5c145a09876","Type":"ContainerStarted","Data":"a6bce052793552ef93546f4f63736183375deacd49061f58a071b2a55d67066f"}
Apr 21 00:03:47.262328 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.262308 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/0.log"
Apr 21 00:03:47.262426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.262345 2571 generic.go:358] "Generic (PLEG): container finished" podID="e91dab5b-ef33-493b-9cc9-99941410ef37" containerID="51f6e26bccb279d4dfc6005d62516d41e9bef4b234a883d72937f4b0b4d01348" exitCode=255
Apr 21 00:03:47.262557 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.262527 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p" event={"ID":"e91dab5b-ef33-493b-9cc9-99941410ef37","Type":"ContainerDied","Data":"51f6e26bccb279d4dfc6005d62516d41e9bef4b234a883d72937f4b0b4d01348"}
Apr 21 00:03:47.262725 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.262710 2571 scope.go:117] "RemoveContainer" containerID="51f6e26bccb279d4dfc6005d62516d41e9bef4b234a883d72937f4b0b4d01348"
Apr 21 00:03:47.268491 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.268022 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4" event={"ID":"010e4dff-e2ae-4168-aa42-10e2537edc3c","Type":"ContainerStarted","Data":"221640c50e0ff7690b25c37ed1eb3b6594e193549809320aab6fc4f8bfc58332"}
Apr 21 00:03:47.268993 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.268950 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-s2mw9" podStartSLOduration=35.932166761 podStartE2EDuration="48.268937591s" podCreationTimestamp="2026-04-21 00:02:59 +0000 UTC" firstStartedPulling="2026-04-21 00:03:33.670719311 +0000 UTC m=+35.284090812" lastFinishedPulling="2026-04-21 00:03:46.007490143 +0000 UTC m=+47.620861642" observedRunningTime="2026-04-21 00:03:47.266795997 +0000 UTC m=+48.880167519" watchObservedRunningTime="2026-04-21 00:03:47.268937591 +0000 UTC m=+48.882309111"
Apr 21 00:03:47.270916 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.270884 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788cb5b5d9-259j7" event={"ID":"2d04f98f-58fd-479e-8553-0b49ee6f1c58","Type":"ContainerStarted","Data":"8dc964356f9c10429a19bf6f01f533e5acfaebfbdc46f19166b5113ceed613bf"}
Apr 21 00:03:47.272694 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.272671 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw" event={"ID":"7060df7d-1449-4026-9723-09376f46de81","Type":"ContainerStarted","Data":"e61779f326bab628383fbdd1adf97f6f4fa813081704e95ab5e7fafd42a970c3"}
Apr 21 00:03:47.273162 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.273136 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw"
Apr 21 00:03:47.275262 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.275244 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw"
Apr 21 00:03:47.275619 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.275596 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b4dln" event={"ID":"73af315a-43db-4687-aa2e-2555ab2f3d65","Type":"ContainerStarted","Data":"eb49b58049035fe27bafebe26b463c0d1fa38abba4d8dbf5d39bcb9a287471cf"}
Apr 21 00:03:47.298114 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.297515 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d4kjt" podStartSLOduration=5.979605018 podStartE2EDuration="18.297498625s" podCreationTimestamp="2026-04-21 00:03:29 +0000 UTC" firstStartedPulling="2026-04-21 00:03:33.671168689 +0000 UTC m=+35.284540191" lastFinishedPulling="2026-04-21 00:03:45.989062297 +0000 UTC m=+47.602433798" observedRunningTime="2026-04-21 00:03:47.28149491 +0000 UTC m=+48.894866431" watchObservedRunningTime="2026-04-21 00:03:47.297498625 +0000 UTC m=+48.910870147"
Apr 21 00:03:47.299424 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.298777 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jtsx5" podStartSLOduration=5.725590884 podStartE2EDuration="18.298766007s" podCreationTimestamp="2026-04-21 00:03:29 +0000 UTC" firstStartedPulling="2026-04-21 00:03:33.661740225 +0000 UTC m=+35.275111728" lastFinishedPulling="2026-04-21 00:03:46.234915337 +0000 UTC m=+47.848286851" observedRunningTime="2026-04-21 00:03:47.296874192 +0000 UTC m=+48.910245715" watchObservedRunningTime="2026-04-21 00:03:47.298766007 +0000 UTC m=+48.912137529"
Apr 21 00:03:47.336227 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.335217 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9" podStartSLOduration=5.996637305 podStartE2EDuration="18.335198934s" podCreationTimestamp="2026-04-21 00:03:29 +0000 UTC" firstStartedPulling="2026-04-21 00:03:33.668915758 +0000 UTC m=+35.282287263" lastFinishedPulling="2026-04-21 00:03:46.00747739 +0000 UTC m=+47.620848892" observedRunningTime="2026-04-21 00:03:47.334990388 +0000 UTC m=+48.948361912" watchObservedRunningTime="2026-04-21 00:03:47.335198934 +0000 UTC m=+48.948570454"
Apr 21 00:03:47.370445 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.369743 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788cb5b5d9-259j7" podStartSLOduration=7.815724037 podStartE2EDuration="20.369730533s" podCreationTimestamp="2026-04-21 00:03:27 +0000 UTC" firstStartedPulling="2026-04-21 00:03:33.70088265 +0000 UTC m=+35.314254153" lastFinishedPulling="2026-04-21 00:03:46.25488915 +0000 UTC m=+47.868260649" observedRunningTime="2026-04-21 00:03:47.368626452 +0000 UTC m=+48.981997971" watchObservedRunningTime="2026-04-21 00:03:47.369730533 +0000 UTC m=+48.983102055"
Apr 21 00:03:47.370445 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.370163 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4" podStartSLOduration=6.081989314 podStartE2EDuration="18.370155488s" podCreationTimestamp="2026-04-21 00:03:29 +0000 UTC" firstStartedPulling="2026-04-21 00:03:33.700856915 +0000 UTC m=+35.314228413" lastFinishedPulling="2026-04-21 00:03:45.989023073 +0000 UTC m=+47.602394587" observedRunningTime="2026-04-21 00:03:47.351266849 +0000 UTC m=+48.964638372" watchObservedRunningTime="2026-04-21 00:03:47.370155488 +0000 UTC m=+48.983527010"
Apr 21 00:03:47.384565 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.384380 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77784c6fb8-jc2kw" podStartSLOduration=8.045163396 podStartE2EDuration="20.384364785s" podCreationTimestamp="2026-04-21 00:03:27 +0000 UTC" firstStartedPulling="2026-04-21 00:03:33.668640583 +0000 UTC m=+35.282012099" lastFinishedPulling="2026-04-21 00:03:46.007841973 +0000 UTC m=+47.621213488" observedRunningTime="2026-04-21 00:03:47.383850397 +0000 UTC m=+48.997221925" watchObservedRunningTime="2026-04-21 00:03:47.384364785 +0000 UTC m=+48.997736307"
Apr 21 00:03:47.611577 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.611311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw"
Apr 21 00:03:47.611577 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.611489 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 00:03:47.611577 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.611514 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56998694b4-5kqfw: secret "image-registry-tls" not found
Apr 21 00:03:47.611837 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.611583 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls podName:b4420432-cd86-4b5b-a350-f40c3c3cb85b nodeName:}" failed. No retries permitted until 2026-04-21 00:04:03.61156202 +0000 UTC m=+65.224933525 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls") pod "image-registry-56998694b4-5kqfw" (UID: "b4420432-cd86-4b5b-a350-f40c3c3cb85b") : secret "image-registry-tls" not found
Apr 21 00:03:47.712311 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.712259 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m2p9z\" (UID: \"8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z"
Apr 21 00:03:47.712487 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.712329 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: \"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq"
Apr 21 00:03:47.712487 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.712429 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 00:03:47.712487 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.712433 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 00:03:47.712487 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.712479 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2l2dr\" (UID: \"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr"
Apr 21 00:03:47.712695 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.712500 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls podName:4854cd30-d0eb-4603-8e32-c7919b625f6c nodeName:}" failed. No retries permitted until 2026-04-21 00:04:03.712478195 +0000 UTC m=+65.325849719 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cn6dq" (UID: "4854cd30-d0eb-4603-8e32-c7919b625f6c") : secret "cluster-monitoring-operator-tls" not found
Apr 21 00:03:47.712695 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.712542 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 00:03:47.712695 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.712593 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert podName:8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e nodeName:}" failed. No retries permitted until 2026-04-21 00:04:03.712572508 +0000 UTC m=+65.325944011 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-m2p9z" (UID: "8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e") : secret "networking-console-plugin-cert" not found
Apr 21 00:03:47.712695 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.712616 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls podName:f81ce5f8-4def-44ee-ae07-4b37a1fbe3be nodeName:}" failed. No retries permitted until 2026-04-21 00:04:03.712606104 +0000 UTC m=+65.325977609 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2l2dr" (UID: "f81ce5f8-4def-44ee-ae07-4b37a1fbe3be") : secret "samples-operator-tls" not found
Apr 21 00:03:47.815117 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.814556 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:03:47.815117 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.814703 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:03:47.815117 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.814992 2571 secret.go:189] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 00:03:47.816421 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.815572 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle podName:e4103e8a-c222-4304-953c-43f57e73acef nodeName:}" failed. No retries permitted until 2026-04-21 00:04:03.815551055 +0000 UTC m=+65.428922566 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle") pod "router-default-69675cd558-ml7nv" (UID: "e4103e8a-c222-4304-953c-43f57e73acef") : configmap references non-existent config key: service-ca.crt Apr 21 00:03:47.816823 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.816801 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs podName:e4103e8a-c222-4304-953c-43f57e73acef nodeName:}" failed. No retries permitted until 2026-04-21 00:04:03.816777192 +0000 UTC m=+65.430148706 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs") pod "router-default-69675cd558-ml7nv" (UID: "e4103e8a-c222-4304-953c-43f57e73acef") : secret "router-metrics-certs-default" not found Apr 21 00:03:47.917526 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.917402 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd" Apr 21 00:03:47.917712 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.917553 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 00:03:47.920558 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.917816 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls podName:5dca0c9f-ca96-4377-bb4d-280b9c469ca1 nodeName:}" failed. No retries permitted until 2026-04-21 00:04:03.917790305 +0000 UTC m=+65.531161807 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls") pod "dns-default-nxkjd" (UID: "5dca0c9f-ca96-4377-bb4d-280b9c469ca1") : secret "dns-default-metrics-tls" not found Apr 21 00:03:47.920558 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:47.917862 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert\") pod \"ingress-canary-6w5rw\" (UID: \"e28815ec-1f97-4757-b463-8aec1ad6b01e\") " pod="openshift-ingress-canary/ingress-canary-6w5rw" Apr 21 00:03:47.920558 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.917977 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 00:03:47.920558 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:47.918009 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert podName:e28815ec-1f97-4757-b463-8aec1ad6b01e nodeName:}" failed. No retries permitted until 2026-04-21 00:04:03.917998919 +0000 UTC m=+65.531370424 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert") pod "ingress-canary-6w5rw" (UID: "e28815ec-1f97-4757-b463-8aec1ad6b01e") : secret "canary-serving-cert" not found Apr 21 00:03:48.288351 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:48.288323 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log" Apr 21 00:03:48.289317 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:48.289295 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/0.log" Apr 21 00:03:48.289443 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:48.289338 2571 generic.go:358] "Generic (PLEG): container finished" podID="e91dab5b-ef33-493b-9cc9-99941410ef37" containerID="9776e7ef7d9d45e365664c16b5adc4360f0dad72f2df4d7f303e6ea86586a6ed" exitCode=255 Apr 21 00:03:48.289527 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:48.289489 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p" event={"ID":"e91dab5b-ef33-493b-9cc9-99941410ef37","Type":"ContainerDied","Data":"9776e7ef7d9d45e365664c16b5adc4360f0dad72f2df4d7f303e6ea86586a6ed"} Apr 21 00:03:48.289589 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:48.289536 2571 scope.go:117] "RemoveContainer" containerID="51f6e26bccb279d4dfc6005d62516d41e9bef4b234a883d72937f4b0b4d01348" Apr 21 00:03:48.289733 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:48.289705 2571 scope.go:117] "RemoveContainer" containerID="9776e7ef7d9d45e365664c16b5adc4360f0dad72f2df4d7f303e6ea86586a6ed" Apr 21 00:03:48.290000 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:48.289954 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" 
with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-p9l6p_openshift-console-operator(e91dab5b-ef33-493b-9cc9-99941410ef37)\"" pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p" podUID="e91dab5b-ef33-493b-9cc9-99941410ef37" Apr 21 00:03:48.307259 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:48.307210 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-b4dln" podStartSLOduration=6.96105096 podStartE2EDuration="19.30719835s" podCreationTimestamp="2026-04-21 00:03:29 +0000 UTC" firstStartedPulling="2026-04-21 00:03:33.661123255 +0000 UTC m=+35.274494755" lastFinishedPulling="2026-04-21 00:03:46.007270638 +0000 UTC m=+47.620642145" observedRunningTime="2026-04-21 00:03:47.400736891 +0000 UTC m=+49.014108413" watchObservedRunningTime="2026-04-21 00:03:48.30719835 +0000 UTC m=+49.920569871" Apr 21 00:03:49.293493 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:49.293463 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log" Apr 21 00:03:49.293928 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:49.293841 2571 scope.go:117] "RemoveContainer" containerID="9776e7ef7d9d45e365664c16b5adc4360f0dad72f2df4d7f303e6ea86586a6ed" Apr 21 00:03:49.294029 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:49.294011 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-p9l6p_openshift-console-operator(e91dab5b-ef33-493b-9cc9-99941410ef37)\"" pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p" podUID="e91dab5b-ef33-493b-9cc9-99941410ef37" Apr 21 00:03:50.134516 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:50.134486 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gwszj_918f7c2d-8780-4291-b141-5fb77d94b6cf/dns-node-resolver/0.log" Apr 21 00:03:50.734555 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:50.734523 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-pruner-29612160-ghs4n_8c729e77-783c-4ed2-831a-538689f33279/image-pruner/0.log" Apr 21 00:03:51.301802 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:51.301773 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" event={"ID":"25ad4b12-b3eb-4705-9146-fc282c21c226","Type":"ContainerStarted","Data":"57c75d4400d3f777294899b321d1c39c3a1e4155458868d3e493c7ae8c97ccd5"} Apr 21 00:03:51.536189 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:51.536163 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9hvww_dbf6c72e-bf6b-47f7-bac9-1b24d4a37975/node-ca/0.log" Apr 21 00:03:52.296325 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:52.296292 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p" Apr 21 00:03:52.296805 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:52.296333 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p" Apr 21 00:03:52.296805 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:52.296783 2571 scope.go:117] "RemoveContainer" containerID="9776e7ef7d9d45e365664c16b5adc4360f0dad72f2df4d7f303e6ea86586a6ed" Apr 21 00:03:52.297016 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:03:52.296994 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-p9l6p_openshift-console-operator(e91dab5b-ef33-493b-9cc9-99941410ef37)\"" pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p" podUID="e91dab5b-ef33-493b-9cc9-99941410ef37" Apr 21 00:03:52.306313 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:52.306284 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8n24l" event={"ID":"d4c3f54e-8135-4a92-b7dc-1bef279e0201","Type":"ContainerStarted","Data":"c755dfa1ca1c35e35f4c9be23cff86616687ffe58b74a4648d139ead41044960"} Apr 21 00:03:52.308162 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:52.308142 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" event={"ID":"25ad4b12-b3eb-4705-9146-fc282c21c226","Type":"ContainerStarted","Data":"e4cc8457aa83530cb47a6f67dd8bb334477a7487fa68d07f990c3502b7fca0ef"} Apr 21 00:03:52.322745 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:52.322706 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8n24l" podStartSLOduration=41.372611371 podStartE2EDuration="46.322695738s" podCreationTimestamp="2026-04-21 00:03:06 +0000 UTC" firstStartedPulling="2026-04-21 00:03:46.234879286 +0000 UTC m=+47.848250800" lastFinishedPulling="2026-04-21 00:03:51.184963664 +0000 UTC m=+52.798335167" observedRunningTime="2026-04-21 00:03:52.322399771 +0000 UTC m=+53.935771293" watchObservedRunningTime="2026-04-21 00:03:52.322695738 +0000 UTC m=+53.936067258" Apr 21 00:03:52.339919 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:52.339868 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" podStartSLOduration=7.833875737 podStartE2EDuration="25.339855983s" podCreationTimestamp="2026-04-21 00:03:27 +0000 UTC" firstStartedPulling="2026-04-21 00:03:33.666744655 +0000 UTC 
m=+35.280116164" lastFinishedPulling="2026-04-21 00:03:51.17272491 +0000 UTC m=+52.786096410" observedRunningTime="2026-04-21 00:03:52.339431569 +0000 UTC m=+53.952803105" watchObservedRunningTime="2026-04-21 00:03:52.339855983 +0000 UTC m=+53.953227504" Apr 21 00:03:52.937143 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:52.937084 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-m59z4_010e4dff-e2ae-4168-aa42-10e2537edc3c/kube-storage-version-migrator-operator/0.log" Apr 21 00:03:58.152487 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:03:58.152457 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2nrh8" Apr 21 00:04:03.666289 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.666247 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:04:03.668582 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.668559 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls\") pod \"image-registry-56998694b4-5kqfw\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:04:03.767028 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.766987 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2l2dr\" (UID: 
\"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr" Apr 21 00:04:03.767245 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.767067 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m2p9z\" (UID: \"8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z" Apr 21 00:04:03.767245 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.767116 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: \"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" Apr 21 00:04:03.769422 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.769389 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f81ce5f8-4def-44ee-ae07-4b37a1fbe3be-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2l2dr\" (UID: \"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr" Apr 21 00:04:03.769540 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.769464 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4854cd30-d0eb-4603-8e32-c7919b625f6c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cn6dq\" (UID: \"4854cd30-d0eb-4603-8e32-c7919b625f6c\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" Apr 21 00:04:03.792299 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.792275 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m2p9z\" (UID: \"8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z" Apr 21 00:04:03.837923 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.837902 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-pctqm\"" Apr 21 00:04:03.845820 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.845805 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:04:03.867832 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.867802 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:04:03.867975 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.867927 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:04:03.868496 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.868475 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4103e8a-c222-4304-953c-43f57e73acef-service-ca-bundle\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:04:03.869875 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.869851 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-cznvh\"" Apr 21 00:04:03.870162 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.870145 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4103e8a-c222-4304-953c-43f57e73acef-metrics-certs\") pod \"router-default-69675cd558-ml7nv\" (UID: \"e4103e8a-c222-4304-953c-43f57e73acef\") " pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:04:03.877656 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.877640 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr" Apr 21 00:04:03.917066 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.916935 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-fjh5b\"" Apr 21 00:04:03.925699 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.925296 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z" Apr 21 00:04:03.931317 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.931293 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-xljqt\"" Apr 21 00:04:03.940900 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.939572 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" Apr 21 00:04:03.966360 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.966335 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-h76vq\"" Apr 21 00:04:03.968583 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.968554 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert\") pod \"ingress-canary-6w5rw\" (UID: \"e28815ec-1f97-4757-b463-8aec1ad6b01e\") " pod="openshift-ingress-canary/ingress-canary-6w5rw" Apr 21 00:04:03.968690 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.968665 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd" Apr 21 00:04:03.974208 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.973614 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56998694b4-5kqfw"] Apr 21 00:04:03.975847 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.974379 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-69675cd558-ml7nv" Apr 21 00:04:03.975847 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.975493 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e28815ec-1f97-4757-b463-8aec1ad6b01e-cert\") pod \"ingress-canary-6w5rw\" (UID: \"e28815ec-1f97-4757-b463-8aec1ad6b01e\") " pod="openshift-ingress-canary/ingress-canary-6w5rw" Apr 21 00:04:03.976591 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:03.975940 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dca0c9f-ca96-4377-bb4d-280b9c469ca1-metrics-tls\") pod \"dns-default-nxkjd\" (UID: \"5dca0c9f-ca96-4377-bb4d-280b9c469ca1\") " pod="openshift-dns/dns-default-nxkjd" Apr 21 00:04:03.976684 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:04:03.976634 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4420432_cd86_4b5b_a350_f40c3c3cb85b.slice/crio-62d6a7e2c4619b7c9ee6d7f448e917f58fecbccd41753c6ec3a46fcf2d52cfdc WatchSource:0}: Error finding container 62d6a7e2c4619b7c9ee6d7f448e917f58fecbccd41753c6ec3a46fcf2d52cfdc: Status 404 returned error can't find the container with id 62d6a7e2c4619b7c9ee6d7f448e917f58fecbccd41753c6ec3a46fcf2d52cfdc Apr 21 00:04:04.037936 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.037450 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr"] Apr 21 00:04:04.096916 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.096883 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z"] Apr 21 00:04:04.101502 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:04:04.101246 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ebef9e6_96c0_4de0_8ad0_70daf0b29f1e.slice/crio-5c1d66c6488d6a14222b75fa7a5933706f3c3ec2c193557551a227b6a39be5fc WatchSource:0}: Error finding container 5c1d66c6488d6a14222b75fa7a5933706f3c3ec2c193557551a227b6a39be5fc: Status 404 returned error can't find the container with id 5c1d66c6488d6a14222b75fa7a5933706f3c3ec2c193557551a227b6a39be5fc Apr 21 00:04:04.115179 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.115108 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq"] Apr 21 00:04:04.131817 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.131795 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-79cql\"" Apr 21 00:04:04.139884 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.139836 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nxkjd" Apr 21 00:04:04.139961 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.139878 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2glpj\"" Apr 21 00:04:04.148230 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.148209 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6w5rw" Apr 21 00:04:04.153937 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.153880 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-69675cd558-ml7nv"] Apr 21 00:04:04.172902 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:04:04.172802 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4103e8a_c222_4304_953c_43f57e73acef.slice/crio-a17898257dc388c9eadc75ba63b5cdca2ba4ceb0bf241044ed62e27dbcaf18fe WatchSource:0}: Error finding container a17898257dc388c9eadc75ba63b5cdca2ba4ceb0bf241044ed62e27dbcaf18fe: Status 404 returned error can't find the container with id a17898257dc388c9eadc75ba63b5cdca2ba4ceb0bf241044ed62e27dbcaf18fe Apr 21 00:04:04.297183 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.297152 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nxkjd"] Apr 21 00:04:04.301308 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:04:04.301281 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dca0c9f_ca96_4377_bb4d_280b9c469ca1.slice/crio-7a6f7ae8b6dc0ee34804df745b4cf574b6c2c1cb80e0e643819c7e20886370a1 WatchSource:0}: Error finding container 7a6f7ae8b6dc0ee34804df745b4cf574b6c2c1cb80e0e643819c7e20886370a1: Status 404 returned error can't find the container with id 7a6f7ae8b6dc0ee34804df745b4cf574b6c2c1cb80e0e643819c7e20886370a1 Apr 21 00:04:04.318422 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.318391 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6w5rw"] Apr 21 00:04:04.320932 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:04:04.320902 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode28815ec_1f97_4757_b463_8aec1ad6b01e.slice/crio-5a4b9f9aa868f0591eb60cf203ee1ff2039a824c3005e27bea81ee3596a265b4 WatchSource:0}: Error finding container 5a4b9f9aa868f0591eb60cf203ee1ff2039a824c3005e27bea81ee3596a265b4: Status 404 returned error can't find the container with id 5a4b9f9aa868f0591eb60cf203ee1ff2039a824c3005e27bea81ee3596a265b4
Apr 21 00:04:04.342924 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.342889 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr" event={"ID":"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be","Type":"ContainerStarted","Data":"63c3eaa45b7d2abc4402a6475aff2ce87dffd72ee4d42741ebe43e1cd3616979"}
Apr 21 00:04:04.344419 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.344389 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-69675cd558-ml7nv" event={"ID":"e4103e8a-c222-4304-953c-43f57e73acef","Type":"ContainerStarted","Data":"0b3d2d427c8aa7e306827d90d268ce103edefde54c373e397af20a9e403f7873"}
Apr 21 00:04:04.344542 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.344429 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-69675cd558-ml7nv" event={"ID":"e4103e8a-c222-4304-953c-43f57e73acef","Type":"ContainerStarted","Data":"a17898257dc388c9eadc75ba63b5cdca2ba4ceb0bf241044ed62e27dbcaf18fe"}
Apr 21 00:04:04.345520 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.345500 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z" event={"ID":"8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e","Type":"ContainerStarted","Data":"5c1d66c6488d6a14222b75fa7a5933706f3c3ec2c193557551a227b6a39be5fc"}
Apr 21 00:04:04.346472 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.346453 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6w5rw" event={"ID":"e28815ec-1f97-4757-b463-8aec1ad6b01e","Type":"ContainerStarted","Data":"5a4b9f9aa868f0591eb60cf203ee1ff2039a824c3005e27bea81ee3596a265b4"}
Apr 21 00:04:04.347887 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.347862 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56998694b4-5kqfw" event={"ID":"b4420432-cd86-4b5b-a350-f40c3c3cb85b","Type":"ContainerStarted","Data":"1e0b3cebf091714f2059911a02905d57e700ab2965d71bfc814db69a116b2da4"}
Apr 21 00:04:04.347988 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.347894 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56998694b4-5kqfw" event={"ID":"b4420432-cd86-4b5b-a350-f40c3c3cb85b","Type":"ContainerStarted","Data":"62d6a7e2c4619b7c9ee6d7f448e917f58fecbccd41753c6ec3a46fcf2d52cfdc"}
Apr 21 00:04:04.348067 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.347987 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-56998694b4-5kqfw"
Apr 21 00:04:04.348871 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.348853 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nxkjd" event={"ID":"5dca0c9f-ca96-4377-bb4d-280b9c469ca1","Type":"ContainerStarted","Data":"7a6f7ae8b6dc0ee34804df745b4cf574b6c2c1cb80e0e643819c7e20886370a1"}
Apr 21 00:04:04.349910 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.349893 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" event={"ID":"4854cd30-d0eb-4603-8e32-c7919b625f6c","Type":"ContainerStarted","Data":"e3c6aaa6ea18fdf04d17288b40ed378cbca694c0d22c5ee243c81b823b582884"}
Apr 21 00:04:04.363548 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.363505 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-69675cd558-ml7nv" podStartSLOduration=35.363493808 podStartE2EDuration="35.363493808s" podCreationTimestamp="2026-04-21 00:03:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 00:04:04.36280148 +0000 UTC m=+65.976173018" watchObservedRunningTime="2026-04-21 00:04:04.363493808 +0000 UTC m=+65.976865325"
Apr 21 00:04:04.381230 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.381186 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-56998694b4-5kqfw" podStartSLOduration=65.381173195 podStartE2EDuration="1m5.381173195s" podCreationTimestamp="2026-04-21 00:02:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 00:04:04.380001993 +0000 UTC m=+65.993373549" watchObservedRunningTime="2026-04-21 00:04:04.381173195 +0000 UTC m=+65.994544710"
Apr 21 00:04:04.679034 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.679000 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs\") pod \"network-metrics-daemon-6fz9j\" (UID: \"173d74c8-1f07-4764-a03f-8091e02dc212\") " pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:04:04.681250 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.681231 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173d74c8-1f07-4764-a03f-8091e02dc212-metrics-certs\") pod \"network-metrics-daemon-6fz9j\" (UID: \"173d74c8-1f07-4764-a03f-8091e02dc212\") " pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:04:04.726045 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.726006 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-n52qc\""
Apr 21 00:04:04.734910 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.734890 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6fz9j"
Apr 21 00:04:04.887250 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.887195 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6fz9j"]
Apr 21 00:04:04.975687 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.975588 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:04:04.978863 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:04.978649 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:04:05.355889 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:05.355841 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6fz9j" event={"ID":"173d74c8-1f07-4764-a03f-8091e02dc212","Type":"ContainerStarted","Data":"9079ec08e1f3aca74d3a8f285773be9d6d2bc8094bfc049955119096c6455e6f"}
Apr 21 00:04:05.356152 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:05.356130 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:04:05.358140 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:05.357896 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-69675cd558-ml7nv"
Apr 21 00:04:06.990052 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:06.990025 2571 scope.go:117] "RemoveContainer" containerID="9776e7ef7d9d45e365664c16b5adc4360f0dad72f2df4d7f303e6ea86586a6ed"
Apr 21 00:04:08.620064 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.620023 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-sb2tk"]
Apr 21 00:04:08.638395 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.638370 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sb2tk"]
Apr 21 00:04:08.638552 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.638514 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.641316 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.641274 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pjwht\""
Apr 21 00:04:08.642201 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.642181 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 21 00:04:08.642326 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.642182 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 21 00:04:08.817249 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.817218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8z8x\" (UniqueName: \"kubernetes.io/projected/44e13f0c-c185-4594-8599-652d3fba3595-kube-api-access-c8z8x\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.817389 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.817254 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/44e13f0c-c185-4594-8599-652d3fba3595-data-volume\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.817389 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.817281 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/44e13f0c-c185-4594-8599-652d3fba3595-crio-socket\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.817389 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.817318 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/44e13f0c-c185-4594-8599-652d3fba3595-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.817389 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.817352 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/44e13f0c-c185-4594-8599-652d3fba3595-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.917838 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.917812 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8z8x\" (UniqueName: \"kubernetes.io/projected/44e13f0c-c185-4594-8599-652d3fba3595-kube-api-access-c8z8x\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.917999 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.917844 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/44e13f0c-c185-4594-8599-652d3fba3595-data-volume\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.917999 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.917865 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/44e13f0c-c185-4594-8599-652d3fba3595-crio-socket\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.917999 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.917928 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/44e13f0c-c185-4594-8599-652d3fba3595-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.918202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.918012 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/44e13f0c-c185-4594-8599-652d3fba3595-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.918202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.917940 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/44e13f0c-c185-4594-8599-652d3fba3595-crio-socket\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.918202 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.918188 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/44e13f0c-c185-4594-8599-652d3fba3595-data-volume\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.918753 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.918730 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/44e13f0c-c185-4594-8599-652d3fba3595-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.920403 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.920379 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/44e13f0c-c185-4594-8599-652d3fba3595-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.927928 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.927906 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8z8x\" (UniqueName: \"kubernetes.io/projected/44e13f0c-c185-4594-8599-652d3fba3595-kube-api-access-c8z8x\") pod \"insights-runtime-extractor-sb2tk\" (UID: \"44e13f0c-c185-4594-8599-652d3fba3595\") " pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:08.947891 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:08.947872 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sb2tk"
Apr 21 00:04:09.491863 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:09.491825 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sb2tk"]
Apr 21 00:04:09.493242 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:04:09.493213 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44e13f0c_c185_4594_8599_652d3fba3595.slice/crio-d4559508b1fa519bf8673bdf85df21266f2b583b9d431e9d87231afaaf94212d WatchSource:0}: Error finding container d4559508b1fa519bf8673bdf85df21266f2b583b9d431e9d87231afaaf94212d: Status 404 returned error can't find the container with id d4559508b1fa519bf8673bdf85df21266f2b583b9d431e9d87231afaaf94212d
Apr 21 00:04:10.376570 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.376528 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6w5rw" event={"ID":"e28815ec-1f97-4757-b463-8aec1ad6b01e","Type":"ContainerStarted","Data":"edfe0a29f646257889fad260224d3d280c81042bcaf37dfb087478f3450c2418"}
Apr 21 00:04:10.378221 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.378187 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nxkjd" event={"ID":"5dca0c9f-ca96-4377-bb4d-280b9c469ca1","Type":"ContainerStarted","Data":"4c81a58851e162f57581921c26509125267b1a67017aca9c8c92423ffd933a63"}
Apr 21 00:04:10.378221 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.378216 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nxkjd" event={"ID":"5dca0c9f-ca96-4377-bb4d-280b9c469ca1","Type":"ContainerStarted","Data":"30a0d71df329616d6ab8716e76c8ac888cdd88cea5a2ad965620fef9adefe50c"}
Apr 21 00:04:10.378406 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.378270 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-nxkjd"
Apr 21 00:04:10.379669 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.379648 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" event={"ID":"4854cd30-d0eb-4603-8e32-c7919b625f6c","Type":"ContainerStarted","Data":"feba1e70e6512ee03c941df8b8e311742f258dfea73f5835ad22756c650be91f"}
Apr 21 00:04:10.381205 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.381190 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log"
Apr 21 00:04:10.381384 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.381255 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p" event={"ID":"e91dab5b-ef33-493b-9cc9-99941410ef37","Type":"ContainerStarted","Data":"f6f2860401157948668fbd2b67a20810090f368be9cca2d840eaa1b587276f88"}
Apr 21 00:04:10.381537 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.381505 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p"
Apr 21 00:04:10.382924 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.382892 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6fz9j" event={"ID":"173d74c8-1f07-4764-a03f-8091e02dc212","Type":"ContainerStarted","Data":"b499d8248e25ccb05f596ebe4ba1001e03d34dcae843c0c8c78d48188d025e3b"}
Apr 21 00:04:10.382924 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.382924 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6fz9j" event={"ID":"173d74c8-1f07-4764-a03f-8091e02dc212","Type":"ContainerStarted","Data":"a52d786eac33985134da42334167c9333d152a2bd9287edf8713b169025c7b42"}
Apr 21 00:04:10.384228 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.384210 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sb2tk" event={"ID":"44e13f0c-c185-4594-8599-652d3fba3595","Type":"ContainerStarted","Data":"d06b18f0a1411fe5e41f74f9ddf55eb0815df490ad434802c75142276c5cb4d0"}
Apr 21 00:04:10.384309 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.384233 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sb2tk" event={"ID":"44e13f0c-c185-4594-8599-652d3fba3595","Type":"ContainerStarted","Data":"d4559508b1fa519bf8673bdf85df21266f2b583b9d431e9d87231afaaf94212d"}
Apr 21 00:04:10.385718 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.385695 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr" event={"ID":"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be","Type":"ContainerStarted","Data":"b3c385870d07daa9972bb7ccf6517dfa7bdb94889018736f32510f80ecc6d5a9"}
Apr 21 00:04:10.385796 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.385725 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr" event={"ID":"f81ce5f8-4def-44ee-ae07-4b37a1fbe3be","Type":"ContainerStarted","Data":"5766ea190c716d8566fa874e1c75e50c34cea56ab1c09f76bc8bcb05e8493931"}
Apr 21 00:04:10.386842 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.386818 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p"
Apr 21 00:04:10.386934 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.386922 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z" event={"ID":"8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e","Type":"ContainerStarted","Data":"0f97bca416dc9dcb5a79d31202a55cdb01c755f4e6a7f44deaae112bbc3d1a60"}
Apr 21 00:04:10.391048 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.390978 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6w5rw" podStartSLOduration=34.373837265 podStartE2EDuration="39.390967323s" podCreationTimestamp="2026-04-21 00:03:31 +0000 UTC" firstStartedPulling="2026-04-21 00:04:04.325992932 +0000 UTC m=+65.939364434" lastFinishedPulling="2026-04-21 00:04:09.343122993 +0000 UTC m=+70.956494492" observedRunningTime="2026-04-21 00:04:10.390375476 +0000 UTC m=+72.003746997" watchObservedRunningTime="2026-04-21 00:04:10.390967323 +0000 UTC m=+72.004338882"
Apr 21 00:04:10.406255 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.406180 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cn6dq" podStartSLOduration=36.184928332 podStartE2EDuration="41.406163251s" podCreationTimestamp="2026-04-21 00:03:29 +0000 UTC" firstStartedPulling="2026-04-21 00:04:04.121901247 +0000 UTC m=+65.735272754" lastFinishedPulling="2026-04-21 00:04:09.343136173 +0000 UTC m=+70.956507673" observedRunningTime="2026-04-21 00:04:10.404588135 +0000 UTC m=+72.017959686" watchObservedRunningTime="2026-04-21 00:04:10.406163251 +0000 UTC m=+72.019534774"
Apr 21 00:04:10.422139 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.422069 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2l2dr" podStartSLOduration=36.201850515 podStartE2EDuration="41.422052211s" podCreationTimestamp="2026-04-21 00:03:29 +0000 UTC" firstStartedPulling="2026-04-21 00:04:04.123004494 +0000 UTC m=+65.736375997" lastFinishedPulling="2026-04-21 00:04:09.343206189 +0000 UTC m=+70.956577693" observedRunningTime="2026-04-21 00:04:10.420400786 +0000 UTC m=+72.033772308" watchObservedRunningTime="2026-04-21 00:04:10.422052211 +0000 UTC m=+72.035423733"
Apr 21 00:04:10.441056 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.441002 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m2p9z" podStartSLOduration=36.201830153 podStartE2EDuration="41.440988163s" podCreationTimestamp="2026-04-21 00:03:29 +0000 UTC" firstStartedPulling="2026-04-21 00:04:04.103966184 +0000 UTC m=+65.717337692" lastFinishedPulling="2026-04-21 00:04:09.343124196 +0000 UTC m=+70.956495702" observedRunningTime="2026-04-21 00:04:10.43959585 +0000 UTC m=+72.052967372" watchObservedRunningTime="2026-04-21 00:04:10.440988163 +0000 UTC m=+72.054359684"
Apr 21 00:04:10.456562 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.456510 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nxkjd" podStartSLOduration=34.416921986 podStartE2EDuration="39.456496736s" podCreationTimestamp="2026-04-21 00:03:31 +0000 UTC" firstStartedPulling="2026-04-21 00:04:04.303555107 +0000 UTC m=+65.916926606" lastFinishedPulling="2026-04-21 00:04:09.343129849 +0000 UTC m=+70.956501356" observedRunningTime="2026-04-21 00:04:10.455660418 +0000 UTC m=+72.069031944" watchObservedRunningTime="2026-04-21 00:04:10.456496736 +0000 UTC m=+72.069868256"
Apr 21 00:04:10.473158 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.473108 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-p9l6p" podStartSLOduration=29.146905828 podStartE2EDuration="41.47307986s" podCreationTimestamp="2026-04-21 00:03:29 +0000 UTC" firstStartedPulling="2026-04-21 00:03:33.662847793 +0000 UTC m=+35.276219296" lastFinishedPulling="2026-04-21 00:03:45.989021812 +0000 UTC m=+47.602393328" observedRunningTime="2026-04-21 00:04:10.471761469 +0000 UTC m=+72.085132991" watchObservedRunningTime="2026-04-21 00:04:10.47307986 +0000 UTC m=+72.086451381"
Apr 21 00:04:10.487777 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:10.487723 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6fz9j" podStartSLOduration=66.501358187 podStartE2EDuration="1m11.48770425s" podCreationTimestamp="2026-04-21 00:02:59 +0000 UTC" firstStartedPulling="2026-04-21 00:04:04.896612209 +0000 UTC m=+66.509983726" lastFinishedPulling="2026-04-21 00:04:09.882958277 +0000 UTC m=+71.496329789" observedRunningTime="2026-04-21 00:04:10.485704624 +0000 UTC m=+72.099076144" watchObservedRunningTime="2026-04-21 00:04:10.48770425 +0000 UTC m=+72.101075772"
Apr 21 00:04:11.391513 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:11.391463 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sb2tk" event={"ID":"44e13f0c-c185-4594-8599-652d3fba3595","Type":"ContainerStarted","Data":"3bcad7966eef544d60abce0f529991218773a12613c73ea5beced216cd880481"}
Apr 21 00:04:14.402528 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:14.402493 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sb2tk" event={"ID":"44e13f0c-c185-4594-8599-652d3fba3595","Type":"ContainerStarted","Data":"c5a1bd39c7243b4e1210214c413e9ecaf49389ad089eb7bc4dd9282346147da6"}
Apr 21 00:04:14.418181 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:14.418138 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-sb2tk" podStartSLOduration=2.43233322 podStartE2EDuration="6.418124357s" podCreationTimestamp="2026-04-21 00:04:08 +0000 UTC" firstStartedPulling="2026-04-21 00:04:09.667912313 +0000 UTC m=+71.281283816" lastFinishedPulling="2026-04-21 00:04:13.653703454 +0000 UTC m=+75.267074953" observedRunningTime="2026-04-21 00:04:14.416939436 +0000 UTC m=+76.030310957" watchObservedRunningTime="2026-04-21 00:04:14.418124357 +0000 UTC m=+76.031495878"
Apr 21 00:04:19.296205 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:19.296081 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-s2mw9"
Apr 21 00:04:20.393850 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:20.393817 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nxkjd"
Apr 21 00:04:23.410773 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.410737 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hjvz8"]
Apr 21 00:04:23.413667 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.413649 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.416469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.416443 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 00:04:23.416611 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.416577 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 00:04:23.416681 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.416577 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 00:04:23.417489 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.417469 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 00:04:23.417569 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.417487 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-9p48h\""
Apr 21 00:04:23.426556 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.426528 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wglsb\" (UniqueName: \"kubernetes.io/projected/141eeb26-5732-4466-8de8-641334400202-kube-api-access-wglsb\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.426669 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.426574 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/141eeb26-5732-4466-8de8-641334400202-sys\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.426669 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.426600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/141eeb26-5732-4466-8de8-641334400202-root\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.426669 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.426631 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/141eeb26-5732-4466-8de8-641334400202-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.426669 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.426664 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/141eeb26-5732-4466-8de8-641334400202-node-exporter-tls\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.426844 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.426686 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/141eeb26-5732-4466-8de8-641334400202-node-exporter-wtmp\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.426844 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.426709 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/141eeb26-5732-4466-8de8-641334400202-metrics-client-ca\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.426844 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.426753 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/141eeb26-5732-4466-8de8-641334400202-node-exporter-accelerators-collector-config\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.426844 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.426784 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/141eeb26-5732-4466-8de8-641334400202-node-exporter-textfile\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.527134 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.527075 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/141eeb26-5732-4466-8de8-641334400202-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.527293 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.527150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/141eeb26-5732-4466-8de8-641334400202-node-exporter-tls\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.527293 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.527175 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/141eeb26-5732-4466-8de8-641334400202-node-exporter-wtmp\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.527293 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.527198 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/141eeb26-5732-4466-8de8-641334400202-metrics-client-ca\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.527293 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.527229 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/141eeb26-5732-4466-8de8-641334400202-node-exporter-accelerators-collector-config\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.527293 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.527255 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/141eeb26-5732-4466-8de8-641334400202-node-exporter-textfile\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.527293 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:04:23.527282 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 21 00:04:23.527546 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.527295 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wglsb\" (UniqueName: \"kubernetes.io/projected/141eeb26-5732-4466-8de8-641334400202-kube-api-access-wglsb\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.527546 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.527327 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/141eeb26-5732-4466-8de8-641334400202-sys\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.527546 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.527340 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/141eeb26-5732-4466-8de8-641334400202-node-exporter-wtmp\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.527546 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:04:23.527362 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/141eeb26-5732-4466-8de8-641334400202-node-exporter-tls podName:141eeb26-5732-4466-8de8-641334400202 nodeName:}" failed. No retries permitted until 2026-04-21 00:04:24.027339212 +0000 UTC m=+85.640710731 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/141eeb26-5732-4466-8de8-641334400202-node-exporter-tls") pod "node-exporter-hjvz8" (UID: "141eeb26-5732-4466-8de8-641334400202") : secret "node-exporter-tls" not found
Apr 21 00:04:23.527546 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.527371 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/141eeb26-5732-4466-8de8-641334400202-sys\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.527546 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.527406 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/141eeb26-5732-4466-8de8-641334400202-root\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.527546 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.527473 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/141eeb26-5732-4466-8de8-641334400202-root\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.527961 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.527843 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/141eeb26-5732-4466-8de8-641334400202-node-exporter-textfile\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.528070 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.528046 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/141eeb26-5732-4466-8de8-641334400202-node-exporter-accelerators-collector-config\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.528144 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.528069 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/141eeb26-5732-4466-8de8-641334400202-metrics-client-ca\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.529613 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.529595 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/141eeb26-5732-4466-8de8-641334400202-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.538394 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.538370 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wglsb\" (UniqueName: \"kubernetes.io/projected/141eeb26-5732-4466-8de8-641334400202-kube-api-access-wglsb\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8"
Apr 21 00:04:23.850558 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.850474 2571 patch_prober.go:28] interesting pod/image-registry-56998694b4-5kqfw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service
unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 00:04:23.850719 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:23.850578 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-56998694b4-5kqfw" podUID="b4420432-cd86-4b5b-a350-f40c3c3cb85b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 00:04:24.031819 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:24.031775 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/141eeb26-5732-4466-8de8-641334400202-node-exporter-tls\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8" Apr 21 00:04:24.034037 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:24.034010 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/141eeb26-5732-4466-8de8-641334400202-node-exporter-tls\") pod \"node-exporter-hjvz8\" (UID: \"141eeb26-5732-4466-8de8-641334400202\") " pod="openshift-monitoring/node-exporter-hjvz8" Apr 21 00:04:24.322970 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:24.322930 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hjvz8" Apr 21 00:04:24.333386 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:04:24.333348 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod141eeb26_5732_4466_8de8_641334400202.slice/crio-e0d8bf0ee91426fd4a5211d7be496bfb08746544a1fb3b2ac61e6ad991d20d85 WatchSource:0}: Error finding container e0d8bf0ee91426fd4a5211d7be496bfb08746544a1fb3b2ac61e6ad991d20d85: Status 404 returned error can't find the container with id e0d8bf0ee91426fd4a5211d7be496bfb08746544a1fb3b2ac61e6ad991d20d85 Apr 21 00:04:24.433189 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:24.433150 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hjvz8" event={"ID":"141eeb26-5732-4466-8de8-641334400202","Type":"ContainerStarted","Data":"e0d8bf0ee91426fd4a5211d7be496bfb08746544a1fb3b2ac61e6ad991d20d85"} Apr 21 00:04:25.359932 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:25.359903 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:04:25.439198 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:25.439051 2571 generic.go:358] "Generic (PLEG): container finished" podID="141eeb26-5732-4466-8de8-641334400202" containerID="8e1264db048144336cdd572ededad5bfd6b2ec33b82479e6c95af638b65e96c5" exitCode=0 Apr 21 00:04:25.439198 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:25.439127 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hjvz8" event={"ID":"141eeb26-5732-4466-8de8-641334400202","Type":"ContainerDied","Data":"8e1264db048144336cdd572ededad5bfd6b2ec33b82479e6c95af638b65e96c5"} Apr 21 00:04:26.444233 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:26.444198 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hjvz8" 
event={"ID":"141eeb26-5732-4466-8de8-641334400202","Type":"ContainerStarted","Data":"3bb252ed8c92cc1fc45b182b26f9ef554936249c23f0316fd26910005681dc9a"} Apr 21 00:04:26.444233 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:26.444234 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hjvz8" event={"ID":"141eeb26-5732-4466-8de8-641334400202","Type":"ContainerStarted","Data":"286bd3586b47262f3693f30b2019e96eeda912832c5f3f9dd2d895ef1e1e24a3"} Apr 21 00:04:26.464130 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:26.464063 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hjvz8" podStartSLOduration=2.663630351 podStartE2EDuration="3.464046553s" podCreationTimestamp="2026-04-21 00:04:23 +0000 UTC" firstStartedPulling="2026-04-21 00:04:24.335735399 +0000 UTC m=+85.949106897" lastFinishedPulling="2026-04-21 00:04:25.136151599 +0000 UTC m=+86.749523099" observedRunningTime="2026-04-21 00:04:26.462048302 +0000 UTC m=+88.075419837" watchObservedRunningTime="2026-04-21 00:04:26.464046553 +0000 UTC m=+88.077418073" Apr 21 00:04:31.105664 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:31.105625 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56998694b4-5kqfw"] Apr 21 00:04:52.524725 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:52.524692 2571 generic.go:358] "Generic (PLEG): container finished" podID="73af315a-43db-4687-aa2e-2555ab2f3d65" containerID="eb49b58049035fe27bafebe26b463c0d1fa38abba4d8dbf5d39bcb9a287471cf" exitCode=0 Apr 21 00:04:52.525191 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:52.524765 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b4dln" event={"ID":"73af315a-43db-4687-aa2e-2555ab2f3d65","Type":"ContainerDied","Data":"eb49b58049035fe27bafebe26b463c0d1fa38abba4d8dbf5d39bcb9a287471cf"} Apr 21 00:04:52.525249 ip-10-0-143-115 
kubenswrapper[2571]: I0421 00:04:52.525199 2571 scope.go:117] "RemoveContainer" containerID="eb49b58049035fe27bafebe26b463c0d1fa38abba4d8dbf5d39bcb9a287471cf" Apr 21 00:04:52.526152 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:52.526132 2571 generic.go:358] "Generic (PLEG): container finished" podID="010e4dff-e2ae-4168-aa42-10e2537edc3c" containerID="221640c50e0ff7690b25c37ed1eb3b6594e193549809320aab6fc4f8bfc58332" exitCode=0 Apr 21 00:04:52.526261 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:52.526178 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4" event={"ID":"010e4dff-e2ae-4168-aa42-10e2537edc3c","Type":"ContainerDied","Data":"221640c50e0ff7690b25c37ed1eb3b6594e193549809320aab6fc4f8bfc58332"} Apr 21 00:04:52.526487 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:52.526472 2571 scope.go:117] "RemoveContainer" containerID="221640c50e0ff7690b25c37ed1eb3b6594e193549809320aab6fc4f8bfc58332" Apr 21 00:04:53.213316 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:53.213286 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-69675cd558-ml7nv_e4103e8a-c222-4304-953c-43f57e73acef/router/0.log" Apr 21 00:04:53.218513 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:53.218491 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6w5rw_e28815ec-1f97-4757-b463-8aec1ad6b01e/serve-healthcheck-canary/0.log" Apr 21 00:04:53.530946 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:53.530907 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b4dln" event={"ID":"73af315a-43db-4687-aa2e-2555ab2f3d65","Type":"ContainerStarted","Data":"cdc7ac69b99e53a1df9053faef1f056b701666914fd2dbca6fd10ed3aed5adc9"} Apr 21 00:04:53.532557 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:53.532532 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-m59z4" event={"ID":"010e4dff-e2ae-4168-aa42-10e2537edc3c","Type":"ContainerStarted","Data":"d9d40af4d412a45646689d6fcea61d6e964f358b7387804c07647afa68b1922c"} Apr 21 00:04:56.124314 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.124255 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-56998694b4-5kqfw" podUID="b4420432-cd86-4b5b-a350-f40c3c3cb85b" containerName="registry" containerID="cri-o://1e0b3cebf091714f2059911a02905d57e700ab2965d71bfc814db69a116b2da4" gracePeriod=30 Apr 21 00:04:56.361639 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.361617 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:04:56.500457 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.500381 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-certificates\") pod \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " Apr 21 00:04:56.500457 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.500418 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4420432-cd86-4b5b-a350-f40c3c3cb85b-trusted-ca\") pod \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " Apr 21 00:04:56.500457 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.500440 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4420432-cd86-4b5b-a350-f40c3c3cb85b-ca-trust-extracted\") pod \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\" (UID: 
\"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " Apr 21 00:04:56.500735 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.500473 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9ctx\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-kube-api-access-d9ctx\") pod \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " Apr 21 00:04:56.500735 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.500507 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b4420432-cd86-4b5b-a350-f40c3c3cb85b-image-registry-private-configuration\") pod \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " Apr 21 00:04:56.500735 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.500530 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4420432-cd86-4b5b-a350-f40c3c3cb85b-installation-pull-secrets\") pod \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " Apr 21 00:04:56.500735 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.500560 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls\") pod \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " Apr 21 00:04:56.500735 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.500601 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-bound-sa-token\") pod \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\" (UID: \"b4420432-cd86-4b5b-a350-f40c3c3cb85b\") " Apr 21 
00:04:56.501028 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.500987 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b4420432-cd86-4b5b-a350-f40c3c3cb85b" (UID: "b4420432-cd86-4b5b-a350-f40c3c3cb85b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 00:04:56.501189 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.501063 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4420432-cd86-4b5b-a350-f40c3c3cb85b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b4420432-cd86-4b5b-a350-f40c3c3cb85b" (UID: "b4420432-cd86-4b5b-a350-f40c3c3cb85b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 00:04:56.503216 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.503188 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4420432-cd86-4b5b-a350-f40c3c3cb85b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b4420432-cd86-4b5b-a350-f40c3c3cb85b" (UID: "b4420432-cd86-4b5b-a350-f40c3c3cb85b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 00:04:56.503316 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.503252 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b4420432-cd86-4b5b-a350-f40c3c3cb85b" (UID: "b4420432-cd86-4b5b-a350-f40c3c3cb85b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 00:04:56.503367 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.503334 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-kube-api-access-d9ctx" (OuterVolumeSpecName: "kube-api-access-d9ctx") pod "b4420432-cd86-4b5b-a350-f40c3c3cb85b" (UID: "b4420432-cd86-4b5b-a350-f40c3c3cb85b"). InnerVolumeSpecName "kube-api-access-d9ctx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 00:04:56.503404 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.503370 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4420432-cd86-4b5b-a350-f40c3c3cb85b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b4420432-cd86-4b5b-a350-f40c3c3cb85b" (UID: "b4420432-cd86-4b5b-a350-f40c3c3cb85b"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 00:04:56.503404 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.503378 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b4420432-cd86-4b5b-a350-f40c3c3cb85b" (UID: "b4420432-cd86-4b5b-a350-f40c3c3cb85b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 00:04:56.511937 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.511913 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4420432-cd86-4b5b-a350-f40c3c3cb85b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b4420432-cd86-4b5b-a350-f40c3c3cb85b" (UID: "b4420432-cd86-4b5b-a350-f40c3c3cb85b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 00:04:56.542251 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.542226 2571 generic.go:358] "Generic (PLEG): container finished" podID="b4420432-cd86-4b5b-a350-f40c3c3cb85b" containerID="1e0b3cebf091714f2059911a02905d57e700ab2965d71bfc814db69a116b2da4" exitCode=0 Apr 21 00:04:56.542363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.542281 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56998694b4-5kqfw" event={"ID":"b4420432-cd86-4b5b-a350-f40c3c3cb85b","Type":"ContainerDied","Data":"1e0b3cebf091714f2059911a02905d57e700ab2965d71bfc814db69a116b2da4"} Apr 21 00:04:56.542363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.542290 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56998694b4-5kqfw" Apr 21 00:04:56.542363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.542307 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56998694b4-5kqfw" event={"ID":"b4420432-cd86-4b5b-a350-f40c3c3cb85b","Type":"ContainerDied","Data":"62d6a7e2c4619b7c9ee6d7f448e917f58fecbccd41753c6ec3a46fcf2d52cfdc"} Apr 21 00:04:56.542363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.542322 2571 scope.go:117] "RemoveContainer" containerID="1e0b3cebf091714f2059911a02905d57e700ab2965d71bfc814db69a116b2da4" Apr 21 00:04:56.555681 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.555664 2571 scope.go:117] "RemoveContainer" containerID="1e0b3cebf091714f2059911a02905d57e700ab2965d71bfc814db69a116b2da4" Apr 21 00:04:56.555944 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:04:56.555919 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e0b3cebf091714f2059911a02905d57e700ab2965d71bfc814db69a116b2da4\": container with ID starting with 
1e0b3cebf091714f2059911a02905d57e700ab2965d71bfc814db69a116b2da4 not found: ID does not exist" containerID="1e0b3cebf091714f2059911a02905d57e700ab2965d71bfc814db69a116b2da4" Apr 21 00:04:56.556000 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.555953 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e0b3cebf091714f2059911a02905d57e700ab2965d71bfc814db69a116b2da4"} err="failed to get container status \"1e0b3cebf091714f2059911a02905d57e700ab2965d71bfc814db69a116b2da4\": rpc error: code = NotFound desc = could not find container \"1e0b3cebf091714f2059911a02905d57e700ab2965d71bfc814db69a116b2da4\": container with ID starting with 1e0b3cebf091714f2059911a02905d57e700ab2965d71bfc814db69a116b2da4 not found: ID does not exist" Apr 21 00:04:56.565949 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.565926 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56998694b4-5kqfw"] Apr 21 00:04:56.570575 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.570553 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-56998694b4-5kqfw"] Apr 21 00:04:56.601866 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.601841 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d9ctx\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-kube-api-access-d9ctx\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:04:56.601866 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.601863 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b4420432-cd86-4b5b-a350-f40c3c3cb85b-image-registry-private-configuration\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:04:56.601984 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.601874 2571 reconciler_common.go:299] "Volume detached 
for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4420432-cd86-4b5b-a350-f40c3c3cb85b-installation-pull-secrets\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:04:56.601984 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.601884 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-tls\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:04:56.601984 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.601893 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4420432-cd86-4b5b-a350-f40c3c3cb85b-bound-sa-token\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:04:56.601984 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.601902 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4420432-cd86-4b5b-a350-f40c3c3cb85b-registry-certificates\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:04:56.601984 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.601910 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4420432-cd86-4b5b-a350-f40c3c3cb85b-trusted-ca\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:04:56.601984 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.601919 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4420432-cd86-4b5b-a350-f40c3c3cb85b-ca-trust-extracted\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:04:56.990004 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:56.989973 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4420432-cd86-4b5b-a350-f40c3c3cb85b" 
path="/var/lib/kubelet/pods/b4420432-cd86-4b5b-a350-f40c3c3cb85b/volumes" Apr 21 00:04:57.546526 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:57.546490 2571 generic.go:358] "Generic (PLEG): container finished" podID="37a171e5-c09c-4c2d-ad77-c2c5a6027985" containerID="1f4014c7666072817502106d9e314d3870426b9ba2ae6e2e2c4b02138ca55e6c" exitCode=0 Apr 21 00:04:57.546953 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:57.546562 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9" event={"ID":"37a171e5-c09c-4c2d-ad77-c2c5a6027985","Type":"ContainerDied","Data":"1f4014c7666072817502106d9e314d3870426b9ba2ae6e2e2c4b02138ca55e6c"} Apr 21 00:04:57.546953 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:57.546922 2571 scope.go:117] "RemoveContainer" containerID="1f4014c7666072817502106d9e314d3870426b9ba2ae6e2e2c4b02138ca55e6c" Apr 21 00:04:58.554666 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:04:58.554630 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6kl9" event={"ID":"37a171e5-c09c-4c2d-ad77-c2c5a6027985","Type":"ContainerStarted","Data":"52fff7d3ae4117a03b5dd198138b7097fbbce0e8835e7ecdaafe045638976b3d"} Apr 21 00:05:04.574765 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:04.574717 2571 generic.go:358] "Generic (PLEG): container finished" podID="8c729e77-783c-4ed2-831a-538689f33279" containerID="45de2ef2562e717b7d0d7d8888c6c1d46c2c5fc7780fcdfc75c4bb0d07da8f78" exitCode=0 Apr 21 00:05:04.575145 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:04.574787 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29612160-ghs4n" event={"ID":"8c729e77-783c-4ed2-831a-538689f33279","Type":"ContainerDied","Data":"45de2ef2562e717b7d0d7d8888c6c1d46c2c5fc7780fcdfc75c4bb0d07da8f78"} Apr 21 00:05:05.730964 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:05.730938 2571 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29612160-ghs4n" Apr 21 00:05:05.885675 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:05.885590 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c729e77-783c-4ed2-831a-538689f33279-serviceca\") pod \"8c729e77-783c-4ed2-831a-538689f33279\" (UID: \"8c729e77-783c-4ed2-831a-538689f33279\") " Apr 21 00:05:05.885838 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:05.885690 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6btc\" (UniqueName: \"kubernetes.io/projected/8c729e77-783c-4ed2-831a-538689f33279-kube-api-access-d6btc\") pod \"8c729e77-783c-4ed2-831a-538689f33279\" (UID: \"8c729e77-783c-4ed2-831a-538689f33279\") " Apr 21 00:05:05.886005 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:05.885974 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c729e77-783c-4ed2-831a-538689f33279-serviceca" (OuterVolumeSpecName: "serviceca") pod "8c729e77-783c-4ed2-831a-538689f33279" (UID: "8c729e77-783c-4ed2-831a-538689f33279"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 00:05:05.888589 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:05.888543 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c729e77-783c-4ed2-831a-538689f33279-kube-api-access-d6btc" (OuterVolumeSpecName: "kube-api-access-d6btc") pod "8c729e77-783c-4ed2-831a-538689f33279" (UID: "8c729e77-783c-4ed2-831a-538689f33279"). InnerVolumeSpecName "kube-api-access-d6btc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 00:05:05.987420 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:05.987305 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d6btc\" (UniqueName: \"kubernetes.io/projected/8c729e77-783c-4ed2-831a-538689f33279-kube-api-access-d6btc\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:05:05.987420 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:05.987346 2571 reconciler_common.go:299] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c729e77-783c-4ed2-831a-538689f33279-serviceca\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:05:06.588122 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:06.584795 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29612160-ghs4n" event={"ID":"8c729e77-783c-4ed2-831a-538689f33279","Type":"ContainerDied","Data":"df8d619e32d0c4e99f005393b4afbbaa86cbdbed422cbeaac1573afc5171efed"} Apr 21 00:05:06.588122 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:06.584848 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df8d619e32d0c4e99f005393b4afbbaa86cbdbed422cbeaac1573afc5171efed" Apr 21 00:05:06.588122 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:06.584942 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29612160-ghs4n"
Apr 21 00:05:12.266041 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:12.265992 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" podUID="25ad4b12-b3eb-4705-9146-fc282c21c226" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 21 00:05:22.266407 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:22.266366 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" podUID="25ad4b12-b3eb-4705-9146-fc282c21c226" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 21 00:05:32.266498 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:32.266457 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" podUID="25ad4b12-b3eb-4705-9146-fc282c21c226" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 21 00:05:32.266947 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:32.266536 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj"
Apr 21 00:05:32.267032 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:32.267013 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"e4cc8457aa83530cb47a6f67dd8bb334477a7487fa68d07f990c3502b7fca0ef"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 21 00:05:32.267074 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:32.267051 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" podUID="25ad4b12-b3eb-4705-9146-fc282c21c226" containerName="service-proxy" containerID="cri-o://e4cc8457aa83530cb47a6f67dd8bb334477a7487fa68d07f990c3502b7fca0ef" gracePeriod=30
Apr 21 00:05:32.658830 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:32.658743 2571 generic.go:358] "Generic (PLEG): container finished" podID="25ad4b12-b3eb-4705-9146-fc282c21c226" containerID="e4cc8457aa83530cb47a6f67dd8bb334477a7487fa68d07f990c3502b7fca0ef" exitCode=2
Apr 21 00:05:32.658830 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:32.658810 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" event={"ID":"25ad4b12-b3eb-4705-9146-fc282c21c226","Type":"ContainerDied","Data":"e4cc8457aa83530cb47a6f67dd8bb334477a7487fa68d07f990c3502b7fca0ef"}
Apr 21 00:05:32.659040 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:05:32.658858 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c6756898-dclgj" event={"ID":"25ad4b12-b3eb-4705-9146-fc282c21c226","Type":"ContainerStarted","Data":"5be4ef58ae2f76e916047b29f94318197f75e39af0fc35ec0f7a2a45af9a9525"}
Apr 21 00:07:58.882966 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:07:58.882933 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log"
Apr 21 00:07:58.885782 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:07:58.885760 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log"
Apr 21 00:07:58.891140 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:07:58.891118 2571 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 00:08:48.640642 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.640555 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"]
Apr 21 00:08:48.641164 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.640919 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c729e77-783c-4ed2-831a-538689f33279" containerName="image-pruner"
Apr 21 00:08:48.641164 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.640935 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c729e77-783c-4ed2-831a-538689f33279" containerName="image-pruner"
Apr 21 00:08:48.641164 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.640954 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4420432-cd86-4b5b-a350-f40c3c3cb85b" containerName="registry"
Apr 21 00:08:48.641164 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.640960 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4420432-cd86-4b5b-a350-f40c3c3cb85b" containerName="registry"
Apr 21 00:08:48.641164 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.641014 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4420432-cd86-4b5b-a350-f40c3c3cb85b" containerName="registry"
Apr 21 00:08:48.641164 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.641022 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c729e77-783c-4ed2-831a-538689f33279" containerName="image-pruner"
Apr 21 00:08:48.643567 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.643548 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"
Apr 21 00:08:48.646085 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.646057 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 21 00:08:48.646219 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.646189 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-fjxgs\""
Apr 21 00:08:48.646219 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.646205 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 21 00:08:48.646331 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.646312 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 21 00:08:48.646554 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.646539 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 21 00:08:48.656870 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.656839 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"]
Apr 21 00:08:48.733500 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.733470 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34695680-8474-417f-927b-85cfda2c3b18-apiservice-cert\") pod \"opendatahub-operator-controller-manager-587f5698df-7dp74\" (UID: \"34695680-8474-417f-927b-85cfda2c3b18\") " pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"
Apr 21 00:08:48.733623 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.733503 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34695680-8474-417f-927b-85cfda2c3b18-webhook-cert\") pod \"opendatahub-operator-controller-manager-587f5698df-7dp74\" (UID: \"34695680-8474-417f-927b-85cfda2c3b18\") " pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"
Apr 21 00:08:48.733623 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.733539 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8pvj\" (UniqueName: \"kubernetes.io/projected/34695680-8474-417f-927b-85cfda2c3b18-kube-api-access-l8pvj\") pod \"opendatahub-operator-controller-manager-587f5698df-7dp74\" (UID: \"34695680-8474-417f-927b-85cfda2c3b18\") " pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"
Apr 21 00:08:48.834341 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.834311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34695680-8474-417f-927b-85cfda2c3b18-apiservice-cert\") pod \"opendatahub-operator-controller-manager-587f5698df-7dp74\" (UID: \"34695680-8474-417f-927b-85cfda2c3b18\") " pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"
Apr 21 00:08:48.834443 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.834348 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34695680-8474-417f-927b-85cfda2c3b18-webhook-cert\") pod \"opendatahub-operator-controller-manager-587f5698df-7dp74\" (UID: \"34695680-8474-417f-927b-85cfda2c3b18\") " pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"
Apr 21 00:08:48.834443 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.834397 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8pvj\" (UniqueName: \"kubernetes.io/projected/34695680-8474-417f-927b-85cfda2c3b18-kube-api-access-l8pvj\") pod \"opendatahub-operator-controller-manager-587f5698df-7dp74\" (UID: \"34695680-8474-417f-927b-85cfda2c3b18\") " pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"
Apr 21 00:08:48.836666 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.836643 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34695680-8474-417f-927b-85cfda2c3b18-apiservice-cert\") pod \"opendatahub-operator-controller-manager-587f5698df-7dp74\" (UID: \"34695680-8474-417f-927b-85cfda2c3b18\") " pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"
Apr 21 00:08:48.836768 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.836693 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34695680-8474-417f-927b-85cfda2c3b18-webhook-cert\") pod \"opendatahub-operator-controller-manager-587f5698df-7dp74\" (UID: \"34695680-8474-417f-927b-85cfda2c3b18\") " pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"
Apr 21 00:08:48.841963 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.841942 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8pvj\" (UniqueName: \"kubernetes.io/projected/34695680-8474-417f-927b-85cfda2c3b18-kube-api-access-l8pvj\") pod \"opendatahub-operator-controller-manager-587f5698df-7dp74\" (UID: \"34695680-8474-417f-927b-85cfda2c3b18\") " pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"
Apr 21 00:08:48.953737 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:48.953666 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"
Apr 21 00:08:49.084863 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:49.084842 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"]
Apr 21 00:08:49.087064 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:08:49.087037 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34695680_8474_417f_927b_85cfda2c3b18.slice/crio-6b0f1abedab68c8c5d9001f00243fc52cb6a04bec714f0bcd4edda00d6b4ef88 WatchSource:0}: Error finding container 6b0f1abedab68c8c5d9001f00243fc52cb6a04bec714f0bcd4edda00d6b4ef88: Status 404 returned error can't find the container with id 6b0f1abedab68c8c5d9001f00243fc52cb6a04bec714f0bcd4edda00d6b4ef88
Apr 21 00:08:49.088681 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:49.088661 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 00:08:49.194271 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:49.194240 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74" event={"ID":"34695680-8474-417f-927b-85cfda2c3b18","Type":"ContainerStarted","Data":"6b0f1abedab68c8c5d9001f00243fc52cb6a04bec714f0bcd4edda00d6b4ef88"}
Apr 21 00:08:52.207983 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:52.207948 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74" event={"ID":"34695680-8474-417f-927b-85cfda2c3b18","Type":"ContainerStarted","Data":"25689e5b292d2aa8203492c21677062e117e4dca75bb00b8d6810d2274e7b5aa"}
Apr 21 00:08:52.208379 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:52.208108 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"
Apr 21 00:08:52.227559 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:08:52.227516 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74" podStartSLOduration=1.801787199 podStartE2EDuration="4.227505315s" podCreationTimestamp="2026-04-21 00:08:48 +0000 UTC" firstStartedPulling="2026-04-21 00:08:49.088785703 +0000 UTC m=+350.702157201" lastFinishedPulling="2026-04-21 00:08:51.514503808 +0000 UTC m=+353.127875317" observedRunningTime="2026-04-21 00:08:52.22588487 +0000 UTC m=+353.839256402" watchObservedRunningTime="2026-04-21 00:08:52.227505315 +0000 UTC m=+353.840876897"
Apr 21 00:09:03.212440 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:03.212407 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-587f5698df-7dp74"
Apr 21 00:09:09.317020 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:09.316983 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-df9nr"]
Apr 21 00:09:09.320243 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:09.320226 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-df9nr"
Apr 21 00:09:09.322692 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:09.322673 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 21 00:09:09.322799 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:09.322700 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-hdbkf\""
Apr 21 00:09:09.329453 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:09.329426 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-df9nr"]
Apr 21 00:09:09.387348 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:09.387321 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aca2acb4-b224-4ea6-95d4-acfc142efd80-cert\") pod \"odh-model-controller-858dbf95b8-df9nr\" (UID: \"aca2acb4-b224-4ea6-95d4-acfc142efd80\") " pod="opendatahub/odh-model-controller-858dbf95b8-df9nr"
Apr 21 00:09:09.387462 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:09.387362 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bmw7\" (UniqueName: \"kubernetes.io/projected/aca2acb4-b224-4ea6-95d4-acfc142efd80-kube-api-access-6bmw7\") pod \"odh-model-controller-858dbf95b8-df9nr\" (UID: \"aca2acb4-b224-4ea6-95d4-acfc142efd80\") " pod="opendatahub/odh-model-controller-858dbf95b8-df9nr"
Apr 21 00:09:09.488480 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:09.488447 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aca2acb4-b224-4ea6-95d4-acfc142efd80-cert\") pod \"odh-model-controller-858dbf95b8-df9nr\" (UID: \"aca2acb4-b224-4ea6-95d4-acfc142efd80\") " pod="opendatahub/odh-model-controller-858dbf95b8-df9nr"
Apr 21 00:09:09.488601 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:09.488499 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bmw7\" (UniqueName: \"kubernetes.io/projected/aca2acb4-b224-4ea6-95d4-acfc142efd80-kube-api-access-6bmw7\") pod \"odh-model-controller-858dbf95b8-df9nr\" (UID: \"aca2acb4-b224-4ea6-95d4-acfc142efd80\") " pod="opendatahub/odh-model-controller-858dbf95b8-df9nr"
Apr 21 00:09:09.488639 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:09:09.488599 2571 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 21 00:09:09.488691 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:09:09.488680 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aca2acb4-b224-4ea6-95d4-acfc142efd80-cert podName:aca2acb4-b224-4ea6-95d4-acfc142efd80 nodeName:}" failed. No retries permitted until 2026-04-21 00:09:09.988659375 +0000 UTC m=+371.602030880 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aca2acb4-b224-4ea6-95d4-acfc142efd80-cert") pod "odh-model-controller-858dbf95b8-df9nr" (UID: "aca2acb4-b224-4ea6-95d4-acfc142efd80") : secret "odh-model-controller-webhook-cert" not found
Apr 21 00:09:09.499208 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:09.499182 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bmw7\" (UniqueName: \"kubernetes.io/projected/aca2acb4-b224-4ea6-95d4-acfc142efd80-kube-api-access-6bmw7\") pod \"odh-model-controller-858dbf95b8-df9nr\" (UID: \"aca2acb4-b224-4ea6-95d4-acfc142efd80\") " pod="opendatahub/odh-model-controller-858dbf95b8-df9nr"
Apr 21 00:09:09.993038 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:09.993001 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aca2acb4-b224-4ea6-95d4-acfc142efd80-cert\") pod \"odh-model-controller-858dbf95b8-df9nr\" (UID: \"aca2acb4-b224-4ea6-95d4-acfc142efd80\") " pod="opendatahub/odh-model-controller-858dbf95b8-df9nr"
Apr 21 00:09:09.993229 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:09:09.993161 2571 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 21 00:09:09.993229 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:09:09.993225 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aca2acb4-b224-4ea6-95d4-acfc142efd80-cert podName:aca2acb4-b224-4ea6-95d4-acfc142efd80 nodeName:}" failed. No retries permitted until 2026-04-21 00:09:10.993209814 +0000 UTC m=+372.606581318 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aca2acb4-b224-4ea6-95d4-acfc142efd80-cert") pod "odh-model-controller-858dbf95b8-df9nr" (UID: "aca2acb4-b224-4ea6-95d4-acfc142efd80") : secret "odh-model-controller-webhook-cert" not found
Apr 21 00:09:11.001775 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:11.001744 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aca2acb4-b224-4ea6-95d4-acfc142efd80-cert\") pod \"odh-model-controller-858dbf95b8-df9nr\" (UID: \"aca2acb4-b224-4ea6-95d4-acfc142efd80\") " pod="opendatahub/odh-model-controller-858dbf95b8-df9nr"
Apr 21 00:09:11.004132 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:11.004112 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aca2acb4-b224-4ea6-95d4-acfc142efd80-cert\") pod \"odh-model-controller-858dbf95b8-df9nr\" (UID: \"aca2acb4-b224-4ea6-95d4-acfc142efd80\") " pod="opendatahub/odh-model-controller-858dbf95b8-df9nr"
Apr 21 00:09:11.134152 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:11.134116 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-df9nr"
Apr 21 00:09:11.251056 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:11.251030 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-df9nr"]
Apr 21 00:09:11.253475 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:09:11.253439 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaca2acb4_b224_4ea6_95d4_acfc142efd80.slice/crio-4f417b5af835c1350fdee5e6eef2925787178d47e30679ba2588fa068555b48a WatchSource:0}: Error finding container 4f417b5af835c1350fdee5e6eef2925787178d47e30679ba2588fa068555b48a: Status 404 returned error can't find the container with id 4f417b5af835c1350fdee5e6eef2925787178d47e30679ba2588fa068555b48a
Apr 21 00:09:11.271611 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:11.271582 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-df9nr" event={"ID":"aca2acb4-b224-4ea6-95d4-acfc142efd80","Type":"ContainerStarted","Data":"4f417b5af835c1350fdee5e6eef2925787178d47e30679ba2588fa068555b48a"}
Apr 21 00:09:14.283160 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:14.283123 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-df9nr" event={"ID":"aca2acb4-b224-4ea6-95d4-acfc142efd80","Type":"ContainerStarted","Data":"357f7f8c30b316aa0030971285562b722cacdfc0f1a7ed3dd60b7b6d3709a308"}
Apr 21 00:09:14.283526 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:14.283263 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-df9nr"
Apr 21 00:09:14.311483 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:14.311436 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-df9nr" podStartSLOduration=2.571266003 podStartE2EDuration="5.31142145s" podCreationTimestamp="2026-04-21 00:09:09 +0000 UTC" firstStartedPulling="2026-04-21 00:09:11.254759726 +0000 UTC m=+372.868131227" lastFinishedPulling="2026-04-21 00:09:13.994915175 +0000 UTC m=+375.608286674" observedRunningTime="2026-04-21 00:09:14.31013064 +0000 UTC m=+375.923502165" watchObservedRunningTime="2026-04-21 00:09:14.31142145 +0000 UTC m=+375.924793011"
Apr 21 00:09:14.432635 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:14.432599 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-rsvtb"]
Apr 21 00:09:14.436130 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:14.436111 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-rsvtb"
Apr 21 00:09:14.438456 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:14.438424 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 21 00:09:14.438597 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:14.438459 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-9dmtc\""
Apr 21 00:09:14.447169 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:14.447005 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-rsvtb"]
Apr 21 00:09:14.537197 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:14.537087 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55e50a99-fa92-4a13-8811-3ef46753ab44-cert\") pod \"kserve-controller-manager-856948b99f-rsvtb\" (UID: \"55e50a99-fa92-4a13-8811-3ef46753ab44\") " pod="opendatahub/kserve-controller-manager-856948b99f-rsvtb"
Apr 21 00:09:14.537197 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:14.537163 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrv4x\" (UniqueName: \"kubernetes.io/projected/55e50a99-fa92-4a13-8811-3ef46753ab44-kube-api-access-wrv4x\") pod \"kserve-controller-manager-856948b99f-rsvtb\" (UID: \"55e50a99-fa92-4a13-8811-3ef46753ab44\") " pod="opendatahub/kserve-controller-manager-856948b99f-rsvtb"
Apr 21 00:09:14.638449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:14.638415 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrv4x\" (UniqueName: \"kubernetes.io/projected/55e50a99-fa92-4a13-8811-3ef46753ab44-kube-api-access-wrv4x\") pod \"kserve-controller-manager-856948b99f-rsvtb\" (UID: \"55e50a99-fa92-4a13-8811-3ef46753ab44\") " pod="opendatahub/kserve-controller-manager-856948b99f-rsvtb"
Apr 21 00:09:14.638623 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:14.638505 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55e50a99-fa92-4a13-8811-3ef46753ab44-cert\") pod \"kserve-controller-manager-856948b99f-rsvtb\" (UID: \"55e50a99-fa92-4a13-8811-3ef46753ab44\") " pod="opendatahub/kserve-controller-manager-856948b99f-rsvtb"
Apr 21 00:09:14.638623 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:09:14.638599 2571 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 21 00:09:14.638692 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:09:14.638661 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55e50a99-fa92-4a13-8811-3ef46753ab44-cert podName:55e50a99-fa92-4a13-8811-3ef46753ab44 nodeName:}" failed. No retries permitted until 2026-04-21 00:09:15.138645937 +0000 UTC m=+376.752017436 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55e50a99-fa92-4a13-8811-3ef46753ab44-cert") pod "kserve-controller-manager-856948b99f-rsvtb" (UID: "55e50a99-fa92-4a13-8811-3ef46753ab44") : secret "kserve-webhook-server-cert" not found
Apr 21 00:09:14.647049 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:14.647023 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrv4x\" (UniqueName: \"kubernetes.io/projected/55e50a99-fa92-4a13-8811-3ef46753ab44-kube-api-access-wrv4x\") pod \"kserve-controller-manager-856948b99f-rsvtb\" (UID: \"55e50a99-fa92-4a13-8811-3ef46753ab44\") " pod="opendatahub/kserve-controller-manager-856948b99f-rsvtb"
Apr 21 00:09:15.143570 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:15.143534 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55e50a99-fa92-4a13-8811-3ef46753ab44-cert\") pod \"kserve-controller-manager-856948b99f-rsvtb\" (UID: \"55e50a99-fa92-4a13-8811-3ef46753ab44\") " pod="opendatahub/kserve-controller-manager-856948b99f-rsvtb"
Apr 21 00:09:15.146336 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:15.146304 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55e50a99-fa92-4a13-8811-3ef46753ab44-cert\") pod \"kserve-controller-manager-856948b99f-rsvtb\" (UID: \"55e50a99-fa92-4a13-8811-3ef46753ab44\") " pod="opendatahub/kserve-controller-manager-856948b99f-rsvtb"
Apr 21 00:09:15.349469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:15.349434 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-rsvtb"
Apr 21 00:09:15.469029 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:15.468986 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-rsvtb"]
Apr 21 00:09:15.472104 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:09:15.472065 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55e50a99_fa92_4a13_8811_3ef46753ab44.slice/crio-3faf600677e4f0437862cec46e4b33d90ecc442319977e4d87ba650006e6bf3a WatchSource:0}: Error finding container 3faf600677e4f0437862cec46e4b33d90ecc442319977e4d87ba650006e6bf3a: Status 404 returned error can't find the container with id 3faf600677e4f0437862cec46e4b33d90ecc442319977e4d87ba650006e6bf3a
Apr 21 00:09:16.290448 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.290407 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-rsvtb" event={"ID":"55e50a99-fa92-4a13-8811-3ef46753ab44","Type":"ContainerStarted","Data":"3faf600677e4f0437862cec46e4b33d90ecc442319977e4d87ba650006e6bf3a"}
Apr 21 00:09:16.594242 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.594158 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r"]
Apr 21 00:09:16.597384 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.597361 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r"
Apr 21 00:09:16.599870 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.599847 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 21 00:09:16.599977 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.599855 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 21 00:09:16.603983 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.603956 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r"]
Apr 21 00:09:16.658223 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.658189 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/56da5d14-f885-45da-9fb2-726093fe1373-tmp\") pod \"kube-auth-proxy-f6b6bc78b-8qr6r\" (UID: \"56da5d14-f885-45da-9fb2-726093fe1373\") " pod="openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r"
Apr 21 00:09:16.658367 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.658281 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56da5d14-f885-45da-9fb2-726093fe1373-tls-certs\") pod \"kube-auth-proxy-f6b6bc78b-8qr6r\" (UID: \"56da5d14-f885-45da-9fb2-726093fe1373\") " pod="openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r"
Apr 21 00:09:16.658442 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.658424 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8tvz\" (UniqueName: \"kubernetes.io/projected/56da5d14-f885-45da-9fb2-726093fe1373-kube-api-access-z8tvz\") pod \"kube-auth-proxy-f6b6bc78b-8qr6r\" (UID: \"56da5d14-f885-45da-9fb2-726093fe1373\") " pod="openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r"
Apr 21 00:09:16.759681 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.759650 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/56da5d14-f885-45da-9fb2-726093fe1373-tmp\") pod \"kube-auth-proxy-f6b6bc78b-8qr6r\" (UID: \"56da5d14-f885-45da-9fb2-726093fe1373\") " pod="openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r"
Apr 21 00:09:16.759799 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.759683 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56da5d14-f885-45da-9fb2-726093fe1373-tls-certs\") pod \"kube-auth-proxy-f6b6bc78b-8qr6r\" (UID: \"56da5d14-f885-45da-9fb2-726093fe1373\") " pod="openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r"
Apr 21 00:09:16.759799 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.759706 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8tvz\" (UniqueName: \"kubernetes.io/projected/56da5d14-f885-45da-9fb2-726093fe1373-kube-api-access-z8tvz\") pod \"kube-auth-proxy-f6b6bc78b-8qr6r\" (UID: \"56da5d14-f885-45da-9fb2-726093fe1373\") " pod="openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r"
Apr 21 00:09:16.761944 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.761909 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/56da5d14-f885-45da-9fb2-726093fe1373-tmp\") pod \"kube-auth-proxy-f6b6bc78b-8qr6r\" (UID: \"56da5d14-f885-45da-9fb2-726093fe1373\") " pod="openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r"
Apr 21 00:09:16.762074 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.762054 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56da5d14-f885-45da-9fb2-726093fe1373-tls-certs\") pod \"kube-auth-proxy-f6b6bc78b-8qr6r\" (UID: \"56da5d14-f885-45da-9fb2-726093fe1373\") " pod="openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r"
Apr 21 00:09:16.766944 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.766925 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8tvz\" (UniqueName: \"kubernetes.io/projected/56da5d14-f885-45da-9fb2-726093fe1373-kube-api-access-z8tvz\") pod \"kube-auth-proxy-f6b6bc78b-8qr6r\" (UID: \"56da5d14-f885-45da-9fb2-726093fe1373\") " pod="openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r"
Apr 21 00:09:16.908022 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:16.907952 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r"
Apr 21 00:09:17.043367 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:17.043338 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r"]
Apr 21 00:09:17.045921 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:09:17.045892 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56da5d14_f885_45da_9fb2_726093fe1373.slice/crio-39adf6436deb4bd9e460a2101387ac099ea609a9d0fc1067d50f00b4c7da4a71 WatchSource:0}: Error finding container 39adf6436deb4bd9e460a2101387ac099ea609a9d0fc1067d50f00b4c7da4a71: Status 404 returned error can't find the container with id 39adf6436deb4bd9e460a2101387ac099ea609a9d0fc1067d50f00b4c7da4a71
Apr 21 00:09:17.294882 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:17.294844 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r" event={"ID":"56da5d14-f885-45da-9fb2-726093fe1373","Type":"ContainerStarted","Data":"39adf6436deb4bd9e460a2101387ac099ea609a9d0fc1067d50f00b4c7da4a71"}
Apr 21 00:09:20.312512 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:20.312471 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-rsvtb" event={"ID":"55e50a99-fa92-4a13-8811-3ef46753ab44","Type":"ContainerStarted","Data":"fd92b208c2ff30bd4ac72d490262443bd88423dc59969a60e620ac15eda3bda0"}
Apr 21 00:09:20.312927 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:20.312690 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-rsvtb"
Apr 21 00:09:20.326817 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:20.326772 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-rsvtb" podStartSLOduration=2.546828422 podStartE2EDuration="6.326760604s" podCreationTimestamp="2026-04-21 00:09:14 +0000 UTC" firstStartedPulling="2026-04-21 00:09:15.47378729 +0000 UTC m=+377.087158788" lastFinishedPulling="2026-04-21 00:09:19.25371947 +0000 UTC m=+380.867090970" observedRunningTime="2026-04-21 00:09:20.326440632 +0000 UTC m=+381.939812157" watchObservedRunningTime="2026-04-21 00:09:20.326760604 +0000 UTC m=+381.940132167"
Apr 21 00:09:21.317020 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:21.316981 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r" event={"ID":"56da5d14-f885-45da-9fb2-726093fe1373","Type":"ContainerStarted","Data":"b1fb33456d55113e874b0e22b7450989f384b3b5075dbed567b7f46159c45b18"}
Apr 21 00:09:21.332549 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:21.332506 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-f6b6bc78b-8qr6r" podStartSLOduration=2.008269978 podStartE2EDuration="5.332491225s" podCreationTimestamp="2026-04-21 00:09:16 +0000 UTC" firstStartedPulling="2026-04-21 00:09:17.047715754 +0000 UTC m=+378.661087252" lastFinishedPulling="2026-04-21 00:09:20.371937001 +0000 UTC m=+381.985308499" observedRunningTime="2026-04-21 00:09:21.331177129 +0000 UTC m=+382.944548650" watchObservedRunningTime="2026-04-21 00:09:21.332491225 +0000 UTC m=+382.945862745"
Apr 21 00:09:25.289354 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:25.289324 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-df9nr"
Apr 21 00:09:34.911582 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:34.911542 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm"]
Apr 21 00:09:34.917673 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:34.917653 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm"
Apr 21 00:09:34.920516 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:34.920479 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 21 00:09:34.920642 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:34.920561 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 21 00:09:34.920642 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:34.920491 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-9bmsl\""
Apr 21 00:09:34.926177 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:34.926155 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm"]
Apr 21 00:09:35.023395 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:35.023362 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f530b692-c500-4d33-a236-533a6667d8e2-operator-config\") pod \"servicemesh-operator3-55f49c5f94-2zhxm\" (UID: \"f530b692-c500-4d33-a236-533a6667d8e2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm"
Apr 21
00:09:35.023529 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:35.023446 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5jpg\" (UniqueName: \"kubernetes.io/projected/f530b692-c500-4d33-a236-533a6667d8e2-kube-api-access-k5jpg\") pod \"servicemesh-operator3-55f49c5f94-2zhxm\" (UID: \"f530b692-c500-4d33-a236-533a6667d8e2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm" Apr 21 00:09:35.124713 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:35.124677 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5jpg\" (UniqueName: \"kubernetes.io/projected/f530b692-c500-4d33-a236-533a6667d8e2-kube-api-access-k5jpg\") pod \"servicemesh-operator3-55f49c5f94-2zhxm\" (UID: \"f530b692-c500-4d33-a236-533a6667d8e2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm" Apr 21 00:09:35.124861 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:35.124737 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f530b692-c500-4d33-a236-533a6667d8e2-operator-config\") pod \"servicemesh-operator3-55f49c5f94-2zhxm\" (UID: \"f530b692-c500-4d33-a236-533a6667d8e2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm" Apr 21 00:09:35.127182 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:35.127154 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f530b692-c500-4d33-a236-533a6667d8e2-operator-config\") pod \"servicemesh-operator3-55f49c5f94-2zhxm\" (UID: \"f530b692-c500-4d33-a236-533a6667d8e2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm" Apr 21 00:09:35.157619 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:35.157593 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5jpg\" (UniqueName: 
\"kubernetes.io/projected/f530b692-c500-4d33-a236-533a6667d8e2-kube-api-access-k5jpg\") pod \"servicemesh-operator3-55f49c5f94-2zhxm\" (UID: \"f530b692-c500-4d33-a236-533a6667d8e2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm" Apr 21 00:09:35.227430 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:35.227336 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm" Apr 21 00:09:35.363449 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:35.363420 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm"] Apr 21 00:09:35.367008 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:09:35.366979 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf530b692_c500_4d33_a236_533a6667d8e2.slice/crio-a95e6cc2f6b1233d43113925f95130043c68fb23d48984ad16faf047ef840bb7 WatchSource:0}: Error finding container a95e6cc2f6b1233d43113925f95130043c68fb23d48984ad16faf047ef840bb7: Status 404 returned error can't find the container with id a95e6cc2f6b1233d43113925f95130043c68fb23d48984ad16faf047ef840bb7 Apr 21 00:09:36.369516 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:36.369465 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm" event={"ID":"f530b692-c500-4d33-a236-533a6667d8e2","Type":"ContainerStarted","Data":"a95e6cc2f6b1233d43113925f95130043c68fb23d48984ad16faf047ef840bb7"} Apr 21 00:09:38.379785 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:38.379750 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm" event={"ID":"f530b692-c500-4d33-a236-533a6667d8e2","Type":"ContainerStarted","Data":"10be50c1224dc371e989dcd6dea4bce0a2bb4b6b6d535b1f94ef54755c62a4bc"} Apr 21 00:09:38.380201 ip-10-0-143-115 
kubenswrapper[2571]: I0421 00:09:38.379878 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm" Apr 21 00:09:38.399933 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:38.399889 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm" podStartSLOduration=2.039656252 podStartE2EDuration="4.399876642s" podCreationTimestamp="2026-04-21 00:09:34 +0000 UTC" firstStartedPulling="2026-04-21 00:09:35.369524239 +0000 UTC m=+396.982895741" lastFinishedPulling="2026-04-21 00:09:37.729744632 +0000 UTC m=+399.343116131" observedRunningTime="2026-04-21 00:09:38.398221524 +0000 UTC m=+400.011593047" watchObservedRunningTime="2026-04-21 00:09:38.399876642 +0000 UTC m=+400.013248164" Apr 21 00:09:44.565171 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.565130 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z"] Apr 21 00:09:44.567378 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.567361 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.569904 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.569879 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 21 00:09:44.570029 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.569882 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 21 00:09:44.570138 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.569890 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 21 00:09:44.570138 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.569900 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 21 00:09:44.570138 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.569961 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-sd2kn\"" Apr 21 00:09:44.578883 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.578861 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z"] Apr 21 00:09:44.594369 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.594346 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/175da6a9-33ab-4c24-b1aa-2909592e217a-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.594471 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.594379 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/175da6a9-33ab-4c24-b1aa-2909592e217a-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.594471 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.594444 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/175da6a9-33ab-4c24-b1aa-2909592e217a-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.594541 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.594521 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkvnb\" (UniqueName: \"kubernetes.io/projected/175da6a9-33ab-4c24-b1aa-2909592e217a-kube-api-access-vkvnb\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.594611 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.594584 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/175da6a9-33ab-4c24-b1aa-2909592e217a-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.594742 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.594666 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/175da6a9-33ab-4c24-b1aa-2909592e217a-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: 
\"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.594817 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.594734 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/175da6a9-33ab-4c24-b1aa-2909592e217a-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.695740 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.695708 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/175da6a9-33ab-4c24-b1aa-2909592e217a-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.695740 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.695748 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/175da6a9-33ab-4c24-b1aa-2909592e217a-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.695959 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.695772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/175da6a9-33ab-4c24-b1aa-2909592e217a-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.695959 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.695808 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vkvnb\" (UniqueName: \"kubernetes.io/projected/175da6a9-33ab-4c24-b1aa-2909592e217a-kube-api-access-vkvnb\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.695959 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.695832 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/175da6a9-33ab-4c24-b1aa-2909592e217a-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.695959 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.695856 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/175da6a9-33ab-4c24-b1aa-2909592e217a-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.695959 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.695880 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/175da6a9-33ab-4c24-b1aa-2909592e217a-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.696501 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.696474 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/175da6a9-33ab-4c24-b1aa-2909592e217a-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.698157 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.698133 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/175da6a9-33ab-4c24-b1aa-2909592e217a-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.698287 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.698270 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/175da6a9-33ab-4c24-b1aa-2909592e217a-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.698521 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.698506 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/175da6a9-33ab-4c24-b1aa-2909592e217a-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.698583 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.698564 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/175da6a9-33ab-4c24-b1aa-2909592e217a-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.703156 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.703136 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/175da6a9-33ab-4c24-b1aa-2909592e217a-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.703602 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.703581 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkvnb\" (UniqueName: \"kubernetes.io/projected/175da6a9-33ab-4c24-b1aa-2909592e217a-kube-api-access-vkvnb\") pod \"istiod-openshift-gateway-55ff986f96-gpv7z\" (UID: \"175da6a9-33ab-4c24-b1aa-2909592e217a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:44.877583 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:44.877504 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:45.011710 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:45.011687 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z"] Apr 21 00:09:45.013908 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:09:45.013879 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod175da6a9_33ab_4c24_b1aa_2909592e217a.slice/crio-b50d2d513f70744f6da6d51b4015d70c5100f5743e129fa74ebfa00b1ac21085 WatchSource:0}: Error finding container b50d2d513f70744f6da6d51b4015d70c5100f5743e129fa74ebfa00b1ac21085: Status 404 returned error can't find the container with id b50d2d513f70744f6da6d51b4015d70c5100f5743e129fa74ebfa00b1ac21085 Apr 21 00:09:45.403241 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:45.403197 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" 
event={"ID":"175da6a9-33ab-4c24-b1aa-2909592e217a","Type":"ContainerStarted","Data":"b50d2d513f70744f6da6d51b4015d70c5100f5743e129fa74ebfa00b1ac21085"} Apr 21 00:09:48.728905 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:48.728862 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 21 00:09:48.729292 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:48.728942 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 21 00:09:49.384804 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:49.384773 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2zhxm" Apr 21 00:09:49.418225 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:49.418188 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" event={"ID":"175da6a9-33ab-4c24-b1aa-2909592e217a","Type":"ContainerStarted","Data":"a248f96d77f273bf608bd97d57d91c685480fd6c84a19911a46400bf8e7fc5a5"} Apr 21 00:09:49.418411 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:49.418302 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:49.419817 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:49.419789 2571 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-gpv7z container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 21 00:09:49.419925 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:49.419846 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" 
podUID="175da6a9-33ab-4c24-b1aa-2909592e217a" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 00:09:49.438689 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:49.438620 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" podStartSLOduration=1.725611041 podStartE2EDuration="5.438602362s" podCreationTimestamp="2026-04-21 00:09:44 +0000 UTC" firstStartedPulling="2026-04-21 00:09:45.015625893 +0000 UTC m=+406.628997392" lastFinishedPulling="2026-04-21 00:09:48.728617211 +0000 UTC m=+410.341988713" observedRunningTime="2026-04-21 00:09:49.437725962 +0000 UTC m=+411.051097484" watchObservedRunningTime="2026-04-21 00:09:49.438602362 +0000 UTC m=+411.051973884" Apr 21 00:09:50.422853 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:50.422826 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gpv7z" Apr 21 00:09:51.322674 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:09:51.322639 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-rsvtb" Apr 21 00:10:45.229587 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:45.229554 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-qb8p6"] Apr 21 00:10:45.234520 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:45.234497 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qb8p6" Apr 21 00:10:45.237208 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:45.237191 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 00:10:45.238054 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:45.238036 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-59ftr\"" Apr 21 00:10:45.238167 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:45.238053 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 00:10:45.238167 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:45.238053 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 21 00:10:45.242498 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:45.242477 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-qb8p6"] Apr 21 00:10:45.287790 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:45.287765 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcdhq\" (UniqueName: \"kubernetes.io/projected/466b6843-581d-41df-9774-68a94c072b4f-kube-api-access-gcdhq\") pod \"dns-operator-controller-manager-648d5c98bc-qb8p6\" (UID: \"466b6843-581d-41df-9774-68a94c072b4f\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qb8p6" Apr 21 00:10:45.388209 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:45.388179 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcdhq\" (UniqueName: \"kubernetes.io/projected/466b6843-581d-41df-9774-68a94c072b4f-kube-api-access-gcdhq\") pod 
\"dns-operator-controller-manager-648d5c98bc-qb8p6\" (UID: \"466b6843-581d-41df-9774-68a94c072b4f\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qb8p6" Apr 21 00:10:45.399070 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:45.399044 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcdhq\" (UniqueName: \"kubernetes.io/projected/466b6843-581d-41df-9774-68a94c072b4f-kube-api-access-gcdhq\") pod \"dns-operator-controller-manager-648d5c98bc-qb8p6\" (UID: \"466b6843-581d-41df-9774-68a94c072b4f\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qb8p6" Apr 21 00:10:45.545180 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:45.545150 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qb8p6" Apr 21 00:10:45.666470 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:45.666433 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-qb8p6"] Apr 21 00:10:45.669524 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:10:45.669501 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod466b6843_581d_41df_9774_68a94c072b4f.slice/crio-9c6c71301e2d80bb0c23457f6a59b38c1a44249a8a2cc25918a7104386b3e254 WatchSource:0}: Error finding container 9c6c71301e2d80bb0c23457f6a59b38c1a44249a8a2cc25918a7104386b3e254: Status 404 returned error can't find the container with id 9c6c71301e2d80bb0c23457f6a59b38c1a44249a8a2cc25918a7104386b3e254 Apr 21 00:10:46.600412 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:46.600376 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qb8p6" event={"ID":"466b6843-581d-41df-9774-68a94c072b4f","Type":"ContainerStarted","Data":"9c6c71301e2d80bb0c23457f6a59b38c1a44249a8a2cc25918a7104386b3e254"} Apr 21 
00:10:48.609888 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:48.609850 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qb8p6" event={"ID":"466b6843-581d-41df-9774-68a94c072b4f","Type":"ContainerStarted","Data":"fae6dfb0bed842e77492c14f9e41456788e4a339947e6014120fb4b0c51d3212"}
Apr 21 00:10:48.610293 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:48.609972 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qb8p6"
Apr 21 00:10:48.625894 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:48.625852 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qb8p6" podStartSLOduration=1.069320441 podStartE2EDuration="3.625840746s" podCreationTimestamp="2026-04-21 00:10:45 +0000 UTC" firstStartedPulling="2026-04-21 00:10:45.671436489 +0000 UTC m=+467.284807991" lastFinishedPulling="2026-04-21 00:10:48.227956783 +0000 UTC m=+469.841328296" observedRunningTime="2026-04-21 00:10:48.6244597 +0000 UTC m=+470.237831219" watchObservedRunningTime="2026-04-21 00:10:48.625840746 +0000 UTC m=+470.239212264"
Apr 21 00:10:50.076793 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:50.076760 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"]
Apr 21 00:10:50.079032 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:50.079006 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"
Apr 21 00:10:50.081482 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:50.081460 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-7v22d\""
Apr 21 00:10:50.096309 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:50.096282 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"]
Apr 21 00:10:50.128381 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:50.128336 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z884w\" (UniqueName: \"kubernetes.io/projected/77680a27-bc95-43c8-9111-9de1f2265c92-kube-api-access-z884w\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pflc7\" (UID: \"77680a27-bc95-43c8-9111-9de1f2265c92\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"
Apr 21 00:10:50.128532 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:50.128392 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/77680a27-bc95-43c8-9111-9de1f2265c92-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pflc7\" (UID: \"77680a27-bc95-43c8-9111-9de1f2265c92\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"
Apr 21 00:10:50.229484 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:50.229446 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z884w\" (UniqueName: \"kubernetes.io/projected/77680a27-bc95-43c8-9111-9de1f2265c92-kube-api-access-z884w\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pflc7\" (UID: \"77680a27-bc95-43c8-9111-9de1f2265c92\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"
Apr 21 00:10:50.229484 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:50.229487 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/77680a27-bc95-43c8-9111-9de1f2265c92-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pflc7\" (UID: \"77680a27-bc95-43c8-9111-9de1f2265c92\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"
Apr 21 00:10:50.229852 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:50.229833 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/77680a27-bc95-43c8-9111-9de1f2265c92-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pflc7\" (UID: \"77680a27-bc95-43c8-9111-9de1f2265c92\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"
Apr 21 00:10:50.242074 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:50.242043 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z884w\" (UniqueName: \"kubernetes.io/projected/77680a27-bc95-43c8-9111-9de1f2265c92-kube-api-access-z884w\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pflc7\" (UID: \"77680a27-bc95-43c8-9111-9de1f2265c92\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"
Apr 21 00:10:50.390102 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:50.390003 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"
Apr 21 00:10:50.514335 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:50.514302 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"]
Apr 21 00:10:50.517229 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:10:50.517201 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77680a27_bc95_43c8_9111_9de1f2265c92.slice/crio-3b25f4562569a92521cc29ad139447229e1d4f805741f7bf75a41a168a905bed WatchSource:0}: Error finding container 3b25f4562569a92521cc29ad139447229e1d4f805741f7bf75a41a168a905bed: Status 404 returned error can't find the container with id 3b25f4562569a92521cc29ad139447229e1d4f805741f7bf75a41a168a905bed
Apr 21 00:10:50.619856 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:50.619822 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7" event={"ID":"77680a27-bc95-43c8-9111-9de1f2265c92","Type":"ContainerStarted","Data":"3b25f4562569a92521cc29ad139447229e1d4f805741f7bf75a41a168a905bed"}
Apr 21 00:10:56.641769 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:56.641724 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7" event={"ID":"77680a27-bc95-43c8-9111-9de1f2265c92","Type":"ContainerStarted","Data":"06673c2f6bfae3ead3be48fb47e53c52b773a32c7310162442929239407b2906"}
Apr 21 00:10:56.642149 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:56.641794 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"
Apr 21 00:10:56.661715 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:56.661663 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7" podStartSLOduration=1.200424243 podStartE2EDuration="6.661648249s" podCreationTimestamp="2026-04-21 00:10:50 +0000 UTC" firstStartedPulling="2026-04-21 00:10:50.519635313 +0000 UTC m=+472.133006816" lastFinishedPulling="2026-04-21 00:10:55.980859314 +0000 UTC m=+477.594230822" observedRunningTime="2026-04-21 00:10:56.661206672 +0000 UTC m=+478.274578195" watchObservedRunningTime="2026-04-21 00:10:56.661648249 +0000 UTC m=+478.275019769"
Apr 21 00:10:59.617754 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:10:59.617725 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qb8p6"
Apr 21 00:11:07.648009 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:07.647978 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"
Apr 21 00:11:09.306526 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.306485 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"]
Apr 21 00:11:09.306961 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.306724 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7" podUID="77680a27-bc95-43c8-9111-9de1f2265c92" containerName="manager" containerID="cri-o://06673c2f6bfae3ead3be48fb47e53c52b773a32c7310162442929239407b2906" gracePeriod=2
Apr 21 00:11:09.317215 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.317184 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"]
Apr 21 00:11:09.328605 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.328579 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"]
Apr 21 00:11:09.328914 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.328895 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77680a27-bc95-43c8-9111-9de1f2265c92" containerName="manager"
Apr 21 00:11:09.328914 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.328908 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="77680a27-bc95-43c8-9111-9de1f2265c92" containerName="manager"
Apr 21 00:11:09.328994 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.328977 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="77680a27-bc95-43c8-9111-9de1f2265c92" containerName="manager"
Apr 21 00:11:09.330921 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.330905 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"
Apr 21 00:11:09.342758 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.342735 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"]
Apr 21 00:11:09.349294 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.349271 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"]
Apr 21 00:11:09.351787 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.351768 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"
Apr 21 00:11:09.370938 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.370911 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"]
Apr 21 00:11:09.397365 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.397333 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e7c8fd86-bd30-4b77-bef3-be34d36d816e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p2zpw\" (UID: \"e7c8fd86-bd30-4b77-bef3-be34d36d816e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"
Apr 21 00:11:09.397589 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.397432 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctl59\" (UniqueName: \"kubernetes.io/projected/e7c8fd86-bd30-4b77-bef3-be34d36d816e-kube-api-access-ctl59\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p2zpw\" (UID: \"e7c8fd86-bd30-4b77-bef3-be34d36d816e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"
Apr 21 00:11:09.498752 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.498723 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e7c8fd86-bd30-4b77-bef3-be34d36d816e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p2zpw\" (UID: \"e7c8fd86-bd30-4b77-bef3-be34d36d816e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"
Apr 21 00:11:09.498902 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.498764 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/462348dc-4600-4201-a04e-af5503d0d1f7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4\" (UID: \"462348dc-4600-4201-a04e-af5503d0d1f7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"
Apr 21 00:11:09.498902 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.498812 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctl59\" (UniqueName: \"kubernetes.io/projected/e7c8fd86-bd30-4b77-bef3-be34d36d816e-kube-api-access-ctl59\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p2zpw\" (UID: \"e7c8fd86-bd30-4b77-bef3-be34d36d816e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"
Apr 21 00:11:09.498902 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.498841 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knrld\" (UniqueName: \"kubernetes.io/projected/462348dc-4600-4201-a04e-af5503d0d1f7-kube-api-access-knrld\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4\" (UID: \"462348dc-4600-4201-a04e-af5503d0d1f7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"
Apr 21 00:11:09.499127 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.499086 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e7c8fd86-bd30-4b77-bef3-be34d36d816e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p2zpw\" (UID: \"e7c8fd86-bd30-4b77-bef3-be34d36d816e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"
Apr 21 00:11:09.507380 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.507356 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctl59\" (UniqueName: \"kubernetes.io/projected/e7c8fd86-bd30-4b77-bef3-be34d36d816e-kube-api-access-ctl59\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p2zpw\" (UID: \"e7c8fd86-bd30-4b77-bef3-be34d36d816e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"
Apr 21 00:11:09.536443 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.536422 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"
Apr 21 00:11:09.538816 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.538792 2571 status_manager.go:895] "Failed to get status for pod" podUID="77680a27-bc95-43c8-9111-9de1f2265c92" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-pflc7\" is forbidden: User \"system:node:ip-10-0-143-115.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-143-115.ec2.internal' and this object"
Apr 21 00:11:09.600107 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.600026 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/77680a27-bc95-43c8-9111-9de1f2265c92-extensions-socket-volume\") pod \"77680a27-bc95-43c8-9111-9de1f2265c92\" (UID: \"77680a27-bc95-43c8-9111-9de1f2265c92\") "
Apr 21 00:11:09.600261 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.600147 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z884w\" (UniqueName: \"kubernetes.io/projected/77680a27-bc95-43c8-9111-9de1f2265c92-kube-api-access-z884w\") pod \"77680a27-bc95-43c8-9111-9de1f2265c92\" (UID: \"77680a27-bc95-43c8-9111-9de1f2265c92\") "
Apr 21 00:11:09.600261 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.600250 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knrld\" (UniqueName: \"kubernetes.io/projected/462348dc-4600-4201-a04e-af5503d0d1f7-kube-api-access-knrld\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4\" (UID: \"462348dc-4600-4201-a04e-af5503d0d1f7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"
Apr 21 00:11:09.600378 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.600339 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/462348dc-4600-4201-a04e-af5503d0d1f7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4\" (UID: \"462348dc-4600-4201-a04e-af5503d0d1f7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"
Apr 21 00:11:09.600433 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.600388 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77680a27-bc95-43c8-9111-9de1f2265c92-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "77680a27-bc95-43c8-9111-9de1f2265c92" (UID: "77680a27-bc95-43c8-9111-9de1f2265c92"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 00:11:09.600699 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.600678 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/462348dc-4600-4201-a04e-af5503d0d1f7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4\" (UID: \"462348dc-4600-4201-a04e-af5503d0d1f7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"
Apr 21 00:11:09.602273 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.602252 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77680a27-bc95-43c8-9111-9de1f2265c92-kube-api-access-z884w" (OuterVolumeSpecName: "kube-api-access-z884w") pod "77680a27-bc95-43c8-9111-9de1f2265c92" (UID: "77680a27-bc95-43c8-9111-9de1f2265c92"). InnerVolumeSpecName "kube-api-access-z884w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 00:11:09.610530 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.610509 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knrld\" (UniqueName: \"kubernetes.io/projected/462348dc-4600-4201-a04e-af5503d0d1f7-kube-api-access-knrld\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4\" (UID: \"462348dc-4600-4201-a04e-af5503d0d1f7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"
Apr 21 00:11:09.688496 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.688470 2571 generic.go:358] "Generic (PLEG): container finished" podID="77680a27-bc95-43c8-9111-9de1f2265c92" containerID="06673c2f6bfae3ead3be48fb47e53c52b773a32c7310162442929239407b2906" exitCode=0
Apr 21 00:11:09.688692 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.688519 2571 scope.go:117] "RemoveContainer" containerID="06673c2f6bfae3ead3be48fb47e53c52b773a32c7310162442929239407b2906"
Apr 21 00:11:09.688692 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.688522 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7"
Apr 21 00:11:09.689388 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.689368 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"
Apr 21 00:11:09.691119 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.691070 2571 status_manager.go:895] "Failed to get status for pod" podUID="77680a27-bc95-43c8-9111-9de1f2265c92" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-pflc7\" is forbidden: User \"system:node:ip-10-0-143-115.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-143-115.ec2.internal' and this object"
Apr 21 00:11:09.697058 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.697038 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"
Apr 21 00:11:09.698210 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.698190 2571 scope.go:117] "RemoveContainer" containerID="06673c2f6bfae3ead3be48fb47e53c52b773a32c7310162442929239407b2906"
Apr 21 00:11:09.698489 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:11:09.698473 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06673c2f6bfae3ead3be48fb47e53c52b773a32c7310162442929239407b2906\": container with ID starting with 06673c2f6bfae3ead3be48fb47e53c52b773a32c7310162442929239407b2906 not found: ID does not exist" containerID="06673c2f6bfae3ead3be48fb47e53c52b773a32c7310162442929239407b2906"
Apr 21 00:11:09.698542 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.698496 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06673c2f6bfae3ead3be48fb47e53c52b773a32c7310162442929239407b2906"} err="failed to get container status \"06673c2f6bfae3ead3be48fb47e53c52b773a32c7310162442929239407b2906\": rpc error: code = NotFound desc = could not find container \"06673c2f6bfae3ead3be48fb47e53c52b773a32c7310162442929239407b2906\": container with ID starting with 06673c2f6bfae3ead3be48fb47e53c52b773a32c7310162442929239407b2906 not found: ID does not exist"
Apr 21 00:11:09.699208 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.699180 2571 status_manager.go:895] "Failed to get status for pod" podUID="77680a27-bc95-43c8-9111-9de1f2265c92" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-pflc7\" is forbidden: User \"system:node:ip-10-0-143-115.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-143-115.ec2.internal' and this object"
Apr 21 00:11:09.701569 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.701544 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z884w\" (UniqueName: \"kubernetes.io/projected/77680a27-bc95-43c8-9111-9de1f2265c92-kube-api-access-z884w\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\""
Apr 21 00:11:09.701569 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.701568 2571 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/77680a27-bc95-43c8-9111-9de1f2265c92-extensions-socket-volume\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\""
Apr 21 00:11:09.827470 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.827437 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"]
Apr 21 00:11:09.831017 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:11:09.830979 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7c8fd86_bd30_4b77_bef3_be34d36d816e.slice/crio-d1cff0365533b18436f84cd888c4ede49a1139ef1bdffcf25bbd10a461bfc89b WatchSource:0}: Error finding container d1cff0365533b18436f84cd888c4ede49a1139ef1bdffcf25bbd10a461bfc89b: Status 404 returned error can't find the container with id d1cff0365533b18436f84cd888c4ede49a1139ef1bdffcf25bbd10a461bfc89b
Apr 21 00:11:09.853459 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:09.853427 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"]
Apr 21 00:11:09.867235 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:11:09.867213 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod462348dc_4600_4201_a04e_af5503d0d1f7.slice/crio-9a96f186a8cd75285bf21f0a2bd0c30ec04e9dbaaafe5f2f46ea0f4ff7051ea6 WatchSource:0}: Error finding container 9a96f186a8cd75285bf21f0a2bd0c30ec04e9dbaaafe5f2f46ea0f4ff7051ea6: Status 404 returned error can't find the container with id 9a96f186a8cd75285bf21f0a2bd0c30ec04e9dbaaafe5f2f46ea0f4ff7051ea6
Apr 21 00:11:10.693255 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:10.693218 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw" event={"ID":"e7c8fd86-bd30-4b77-bef3-be34d36d816e","Type":"ContainerStarted","Data":"bb9037f64f9d9d0ab6cb2fbf75f5ec1f456d3953bd97cc710f8b5da7c8917b0b"}
Apr 21 00:11:10.693255 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:10.693255 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw" event={"ID":"e7c8fd86-bd30-4b77-bef3-be34d36d816e","Type":"ContainerStarted","Data":"d1cff0365533b18436f84cd888c4ede49a1139ef1bdffcf25bbd10a461bfc89b"}
Apr 21 00:11:10.693727 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:10.693290 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"
Apr 21 00:11:10.695273 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:10.695249 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4" event={"ID":"462348dc-4600-4201-a04e-af5503d0d1f7","Type":"ContainerStarted","Data":"7c417e76ab59c3597869bce24115a094a18700566ad6c7a86d7b3a13df34154a"}
Apr 21 00:11:10.695273 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:10.695275 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4" event={"ID":"462348dc-4600-4201-a04e-af5503d0d1f7","Type":"ContainerStarted","Data":"9a96f186a8cd75285bf21f0a2bd0c30ec04e9dbaaafe5f2f46ea0f4ff7051ea6"}
Apr 21 00:11:10.695401 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:10.695366 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"
Apr 21 00:11:10.718755 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:10.718717 2571 status_manager.go:895] "Failed to get status for pod" podUID="77680a27-bc95-43c8-9111-9de1f2265c92" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-pflc7\" is forbidden: User \"system:node:ip-10-0-143-115.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-143-115.ec2.internal' and this object"
Apr 21 00:11:10.719285 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:10.719241 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw" podStartSLOduration=1.719229178 podStartE2EDuration="1.719229178s" podCreationTimestamp="2026-04-21 00:11:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 00:11:10.716732479 +0000 UTC m=+492.330104013" watchObservedRunningTime="2026-04-21 00:11:10.719229178 +0000 UTC m=+492.332600698"
Apr 21 00:11:10.720609 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:10.720585 2571 status_manager.go:895] "Failed to get status for pod" podUID="77680a27-bc95-43c8-9111-9de1f2265c92" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pflc7" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-pflc7\" is forbidden: User \"system:node:ip-10-0-143-115.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-143-115.ec2.internal' and this object"
Apr 21 00:11:10.739466 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:10.739423 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4" podStartSLOduration=1.739411612 podStartE2EDuration="1.739411612s" podCreationTimestamp="2026-04-21 00:11:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 00:11:10.738805544 +0000 UTC m=+492.352177066" watchObservedRunningTime="2026-04-21 00:11:10.739411612 +0000 UTC m=+492.352783133"
Apr 21 00:11:10.991035 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:10.990955 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77680a27-bc95-43c8-9111-9de1f2265c92" path="/var/lib/kubelet/pods/77680a27-bc95-43c8-9111-9de1f2265c92/volumes"
Apr 21 00:11:21.701442 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:21.701411 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"
Apr 21 00:11:21.701794 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:21.701464 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"
Apr 21 00:11:21.772075 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:21.772041 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"]
Apr 21 00:11:21.772386 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:21.772354 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw" podUID="e7c8fd86-bd30-4b77-bef3-be34d36d816e" containerName="manager" containerID="cri-o://bb9037f64f9d9d0ab6cb2fbf75f5ec1f456d3953bd97cc710f8b5da7c8917b0b" gracePeriod=10
Apr 21 00:11:22.020153 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.020127 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"
Apr 21 00:11:22.027141 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.027113 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"]
Apr 21 00:11:22.027489 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.027471 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7c8fd86-bd30-4b77-bef3-be34d36d816e" containerName="manager"
Apr 21 00:11:22.027567 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.027492 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c8fd86-bd30-4b77-bef3-be34d36d816e" containerName="manager"
Apr 21 00:11:22.027633 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.027619 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7c8fd86-bd30-4b77-bef3-be34d36d816e" containerName="manager"
Apr 21 00:11:22.029589 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.029568 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"
Apr 21 00:11:22.042287 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.042263 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"]
Apr 21 00:11:22.107156 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.107130 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctl59\" (UniqueName: \"kubernetes.io/projected/e7c8fd86-bd30-4b77-bef3-be34d36d816e-kube-api-access-ctl59\") pod \"e7c8fd86-bd30-4b77-bef3-be34d36d816e\" (UID: \"e7c8fd86-bd30-4b77-bef3-be34d36d816e\") "
Apr 21 00:11:22.107311 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.107187 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e7c8fd86-bd30-4b77-bef3-be34d36d816e-extensions-socket-volume\") pod \"e7c8fd86-bd30-4b77-bef3-be34d36d816e\" (UID: \"e7c8fd86-bd30-4b77-bef3-be34d36d816e\") "
Apr 21 00:11:22.107595 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.107571 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7c8fd86-bd30-4b77-bef3-be34d36d816e-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "e7c8fd86-bd30-4b77-bef3-be34d36d816e" (UID: "e7c8fd86-bd30-4b77-bef3-be34d36d816e"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 00:11:22.109205 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.109178 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c8fd86-bd30-4b77-bef3-be34d36d816e-kube-api-access-ctl59" (OuterVolumeSpecName: "kube-api-access-ctl59") pod "e7c8fd86-bd30-4b77-bef3-be34d36d816e" (UID: "e7c8fd86-bd30-4b77-bef3-be34d36d816e"). InnerVolumeSpecName "kube-api-access-ctl59". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 00:11:22.208469 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.208438 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ptsk\" (UniqueName: \"kubernetes.io/projected/0c6bdde0-e761-41d5-9996-766786548757-kube-api-access-6ptsk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4qrc6\" (UID: \"0c6bdde0-e761-41d5-9996-766786548757\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"
Apr 21 00:11:22.208585 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.208489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c6bdde0-e761-41d5-9996-766786548757-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4qrc6\" (UID: \"0c6bdde0-e761-41d5-9996-766786548757\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"
Apr 21 00:11:22.208657 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.208604 2571 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e7c8fd86-bd30-4b77-bef3-be34d36d816e-extensions-socket-volume\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\""
Apr 21 00:11:22.208657 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.208632 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ctl59\" (UniqueName: \"kubernetes.io/projected/e7c8fd86-bd30-4b77-bef3-be34d36d816e-kube-api-access-ctl59\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\""
Apr 21 00:11:22.309066 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.309039 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ptsk\" (UniqueName: \"kubernetes.io/projected/0c6bdde0-e761-41d5-9996-766786548757-kube-api-access-6ptsk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4qrc6\" (UID: \"0c6bdde0-e761-41d5-9996-766786548757\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"
Apr 21 00:11:22.309198 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.309120 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c6bdde0-e761-41d5-9996-766786548757-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4qrc6\" (UID: \"0c6bdde0-e761-41d5-9996-766786548757\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"
Apr 21 00:11:22.309463 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.309444 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c6bdde0-e761-41d5-9996-766786548757-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4qrc6\" (UID: \"0c6bdde0-e761-41d5-9996-766786548757\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"
Apr 21 00:11:22.316721 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.316695 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ptsk\" (UniqueName: \"kubernetes.io/projected/0c6bdde0-e761-41d5-9996-766786548757-kube-api-access-6ptsk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4qrc6\" (UID: \"0c6bdde0-e761-41d5-9996-766786548757\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"
Apr 21 00:11:22.339662 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.339639 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"
Apr 21 00:11:22.469975 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.469944 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"]
Apr 21 00:11:22.472699 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:11:22.472665 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c6bdde0_e761_41d5_9996_766786548757.slice/crio-d4c7413ac3afbdc511ff00447fe901737b491fa5a3f2cac2cc2061430f700422 WatchSource:0}: Error finding container d4c7413ac3afbdc511ff00447fe901737b491fa5a3f2cac2cc2061430f700422: Status 404 returned error can't find the container with id d4c7413ac3afbdc511ff00447fe901737b491fa5a3f2cac2cc2061430f700422
Apr 21 00:11:22.738451 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.738351 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6" event={"ID":"0c6bdde0-e761-41d5-9996-766786548757","Type":"ContainerStarted","Data":"2565faaec16ddd6b663ef0c0ab94c660d871889f426b03b16ae35310b8b92f6c"}
Apr 21 00:11:22.738451 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.738395 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6" event={"ID":"0c6bdde0-e761-41d5-9996-766786548757","Type":"ContainerStarted","Data":"d4c7413ac3afbdc511ff00447fe901737b491fa5a3f2cac2cc2061430f700422"}
Apr 21 00:11:22.738903 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.738454 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"
Apr 21 00:11:22.739546 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.739519 2571 generic.go:358] "Generic (PLEG): container finished"
podID="e7c8fd86-bd30-4b77-bef3-be34d36d816e" containerID="bb9037f64f9d9d0ab6cb2fbf75f5ec1f456d3953bd97cc710f8b5da7c8917b0b" exitCode=0 Apr 21 00:11:22.739648 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.739578 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw" Apr 21 00:11:22.739648 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.739594 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw" event={"ID":"e7c8fd86-bd30-4b77-bef3-be34d36d816e","Type":"ContainerDied","Data":"bb9037f64f9d9d0ab6cb2fbf75f5ec1f456d3953bd97cc710f8b5da7c8917b0b"} Apr 21 00:11:22.739648 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.739621 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw" event={"ID":"e7c8fd86-bd30-4b77-bef3-be34d36d816e","Type":"ContainerDied","Data":"d1cff0365533b18436f84cd888c4ede49a1139ef1bdffcf25bbd10a461bfc89b"} Apr 21 00:11:22.739648 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.739637 2571 scope.go:117] "RemoveContainer" containerID="bb9037f64f9d9d0ab6cb2fbf75f5ec1f456d3953bd97cc710f8b5da7c8917b0b" Apr 21 00:11:22.748784 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.748757 2571 scope.go:117] "RemoveContainer" containerID="bb9037f64f9d9d0ab6cb2fbf75f5ec1f456d3953bd97cc710f8b5da7c8917b0b" Apr 21 00:11:22.749082 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:11:22.749057 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb9037f64f9d9d0ab6cb2fbf75f5ec1f456d3953bd97cc710f8b5da7c8917b0b\": container with ID starting with bb9037f64f9d9d0ab6cb2fbf75f5ec1f456d3953bd97cc710f8b5da7c8917b0b not found: ID does not exist" containerID="bb9037f64f9d9d0ab6cb2fbf75f5ec1f456d3953bd97cc710f8b5da7c8917b0b" Apr 21 
00:11:22.749185 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.749123 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb9037f64f9d9d0ab6cb2fbf75f5ec1f456d3953bd97cc710f8b5da7c8917b0b"} err="failed to get container status \"bb9037f64f9d9d0ab6cb2fbf75f5ec1f456d3953bd97cc710f8b5da7c8917b0b\": rpc error: code = NotFound desc = could not find container \"bb9037f64f9d9d0ab6cb2fbf75f5ec1f456d3953bd97cc710f8b5da7c8917b0b\": container with ID starting with bb9037f64f9d9d0ab6cb2fbf75f5ec1f456d3953bd97cc710f8b5da7c8917b0b not found: ID does not exist" Apr 21 00:11:22.757852 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.757814 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6" podStartSLOduration=0.757801344 podStartE2EDuration="757.801344ms" podCreationTimestamp="2026-04-21 00:11:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 00:11:22.755822688 +0000 UTC m=+504.369194212" watchObservedRunningTime="2026-04-21 00:11:22.757801344 +0000 UTC m=+504.371172864" Apr 21 00:11:22.768903 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.768877 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"] Apr 21 00:11:22.772878 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.772855 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p2zpw"] Apr 21 00:11:22.991003 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:22.990920 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c8fd86-bd30-4b77-bef3-be34d36d816e" path="/var/lib/kubelet/pods/e7c8fd86-bd30-4b77-bef3-be34d36d816e/volumes" Apr 21 00:11:33.747140 ip-10-0-143-115 kubenswrapper[2571]: I0421 
00:11:33.747085 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6" Apr 21 00:11:33.797573 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:33.797538 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"] Apr 21 00:11:33.797793 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:33.797772 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4" podUID="462348dc-4600-4201-a04e-af5503d0d1f7" containerName="manager" containerID="cri-o://7c417e76ab59c3597869bce24115a094a18700566ad6c7a86d7b3a13df34154a" gracePeriod=10 Apr 21 00:11:34.039392 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.039366 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4" Apr 21 00:11:34.101524 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.101494 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knrld\" (UniqueName: \"kubernetes.io/projected/462348dc-4600-4201-a04e-af5503d0d1f7-kube-api-access-knrld\") pod \"462348dc-4600-4201-a04e-af5503d0d1f7\" (UID: \"462348dc-4600-4201-a04e-af5503d0d1f7\") " Apr 21 00:11:34.101675 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.101573 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/462348dc-4600-4201-a04e-af5503d0d1f7-extensions-socket-volume\") pod \"462348dc-4600-4201-a04e-af5503d0d1f7\" (UID: \"462348dc-4600-4201-a04e-af5503d0d1f7\") " Apr 21 00:11:34.102984 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.102338 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/462348dc-4600-4201-a04e-af5503d0d1f7-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "462348dc-4600-4201-a04e-af5503d0d1f7" (UID: "462348dc-4600-4201-a04e-af5503d0d1f7"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 00:11:34.108362 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.108236 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462348dc-4600-4201-a04e-af5503d0d1f7-kube-api-access-knrld" (OuterVolumeSpecName: "kube-api-access-knrld") pod "462348dc-4600-4201-a04e-af5503d0d1f7" (UID: "462348dc-4600-4201-a04e-af5503d0d1f7"). InnerVolumeSpecName "kube-api-access-knrld". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 00:11:34.202649 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.202625 2571 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/462348dc-4600-4201-a04e-af5503d0d1f7-extensions-socket-volume\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:11:34.202649 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.202650 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-knrld\" (UniqueName: \"kubernetes.io/projected/462348dc-4600-4201-a04e-af5503d0d1f7-kube-api-access-knrld\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:11:34.786617 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.786582 2571 generic.go:358] "Generic (PLEG): container finished" podID="462348dc-4600-4201-a04e-af5503d0d1f7" containerID="7c417e76ab59c3597869bce24115a094a18700566ad6c7a86d7b3a13df34154a" exitCode=0 Apr 21 00:11:34.787023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.786641 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4" 
event={"ID":"462348dc-4600-4201-a04e-af5503d0d1f7","Type":"ContainerDied","Data":"7c417e76ab59c3597869bce24115a094a18700566ad6c7a86d7b3a13df34154a"} Apr 21 00:11:34.787023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.786659 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4" Apr 21 00:11:34.787023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.786667 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4" event={"ID":"462348dc-4600-4201-a04e-af5503d0d1f7","Type":"ContainerDied","Data":"9a96f186a8cd75285bf21f0a2bd0c30ec04e9dbaaafe5f2f46ea0f4ff7051ea6"} Apr 21 00:11:34.787023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.786684 2571 scope.go:117] "RemoveContainer" containerID="7c417e76ab59c3597869bce24115a094a18700566ad6c7a86d7b3a13df34154a" Apr 21 00:11:34.794736 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.794714 2571 scope.go:117] "RemoveContainer" containerID="7c417e76ab59c3597869bce24115a094a18700566ad6c7a86d7b3a13df34154a" Apr 21 00:11:34.794979 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:11:34.794955 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c417e76ab59c3597869bce24115a094a18700566ad6c7a86d7b3a13df34154a\": container with ID starting with 7c417e76ab59c3597869bce24115a094a18700566ad6c7a86d7b3a13df34154a not found: ID does not exist" containerID="7c417e76ab59c3597869bce24115a094a18700566ad6c7a86d7b3a13df34154a" Apr 21 00:11:34.795024 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.794991 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c417e76ab59c3597869bce24115a094a18700566ad6c7a86d7b3a13df34154a"} err="failed to get container status \"7c417e76ab59c3597869bce24115a094a18700566ad6c7a86d7b3a13df34154a\": rpc error: 
code = NotFound desc = could not find container \"7c417e76ab59c3597869bce24115a094a18700566ad6c7a86d7b3a13df34154a\": container with ID starting with 7c417e76ab59c3597869bce24115a094a18700566ad6c7a86d7b3a13df34154a not found: ID does not exist" Apr 21 00:11:34.807201 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.807164 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"] Apr 21 00:11:34.817374 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.817349 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lqgp4"] Apr 21 00:11:34.990635 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:34.990605 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="462348dc-4600-4201-a04e-af5503d0d1f7" path="/var/lib/kubelet/pods/462348dc-4600-4201-a04e-af5503d0d1f7/volumes" Apr 21 00:11:56.261182 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.261145 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:11:56.261566 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.261498 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="462348dc-4600-4201-a04e-af5503d0d1f7" containerName="manager" Apr 21 00:11:56.261566 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.261512 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="462348dc-4600-4201-a04e-af5503d0d1f7" containerName="manager" Apr 21 00:11:56.261696 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.261573 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="462348dc-4600-4201-a04e-af5503d0d1f7" containerName="manager" Apr 21 00:11:56.264774 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.264754 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-m4tp9" Apr 21 00:11:56.267355 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.267332 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 21 00:11:56.267473 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.267415 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-pdvtf\"" Apr 21 00:11:56.272155 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.272122 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:11:56.299145 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.299111 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:11:56.366271 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.366243 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/0f61a0ed-6b8e-488e-85be-e8868739e0e1-config-file\") pod \"limitador-limitador-78c99df468-m4tp9\" (UID: \"0f61a0ed-6b8e-488e-85be-e8868739e0e1\") " pod="kuadrant-system/limitador-limitador-78c99df468-m4tp9" Apr 21 00:11:56.366402 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.366282 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jcpk\" (UniqueName: \"kubernetes.io/projected/0f61a0ed-6b8e-488e-85be-e8868739e0e1-kube-api-access-2jcpk\") pod \"limitador-limitador-78c99df468-m4tp9\" (UID: \"0f61a0ed-6b8e-488e-85be-e8868739e0e1\") " pod="kuadrant-system/limitador-limitador-78c99df468-m4tp9" Apr 21 00:11:56.466955 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.466923 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2jcpk\" (UniqueName: \"kubernetes.io/projected/0f61a0ed-6b8e-488e-85be-e8868739e0e1-kube-api-access-2jcpk\") pod \"limitador-limitador-78c99df468-m4tp9\" (UID: \"0f61a0ed-6b8e-488e-85be-e8868739e0e1\") " pod="kuadrant-system/limitador-limitador-78c99df468-m4tp9" Apr 21 00:11:56.467139 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.467015 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/0f61a0ed-6b8e-488e-85be-e8868739e0e1-config-file\") pod \"limitador-limitador-78c99df468-m4tp9\" (UID: \"0f61a0ed-6b8e-488e-85be-e8868739e0e1\") " pod="kuadrant-system/limitador-limitador-78c99df468-m4tp9" Apr 21 00:11:56.467628 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.467608 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/0f61a0ed-6b8e-488e-85be-e8868739e0e1-config-file\") pod \"limitador-limitador-78c99df468-m4tp9\" (UID: \"0f61a0ed-6b8e-488e-85be-e8868739e0e1\") " pod="kuadrant-system/limitador-limitador-78c99df468-m4tp9" Apr 21 00:11:56.474788 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.474770 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jcpk\" (UniqueName: \"kubernetes.io/projected/0f61a0ed-6b8e-488e-85be-e8868739e0e1-kube-api-access-2jcpk\") pod \"limitador-limitador-78c99df468-m4tp9\" (UID: \"0f61a0ed-6b8e-488e-85be-e8868739e0e1\") " pod="kuadrant-system/limitador-limitador-78c99df468-m4tp9" Apr 21 00:11:56.576550 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.576479 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-m4tp9" Apr 21 00:11:56.701026 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.700893 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:11:56.703850 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:11:56.703820 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f61a0ed_6b8e_488e_85be_e8868739e0e1.slice/crio-f1e6008b18200ca7b25267d5b6c3bff3950611231491cab3e29ade18b0ebaedb WatchSource:0}: Error finding container f1e6008b18200ca7b25267d5b6c3bff3950611231491cab3e29ade18b0ebaedb: Status 404 returned error can't find the container with id f1e6008b18200ca7b25267d5b6c3bff3950611231491cab3e29ade18b0ebaedb Apr 21 00:11:56.867053 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:56.866967 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-m4tp9" event={"ID":"0f61a0ed-6b8e-488e-85be-e8868739e0e1","Type":"ContainerStarted","Data":"f1e6008b18200ca7b25267d5b6c3bff3950611231491cab3e29ade18b0ebaedb"} Apr 21 00:11:59.879072 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:59.879040 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-m4tp9" event={"ID":"0f61a0ed-6b8e-488e-85be-e8868739e0e1","Type":"ContainerStarted","Data":"9d9426c0f898e95c874a4840ad1543f9b090218407079b626d5c194c90ddb069"} Apr 21 00:11:59.879463 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:59.879142 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-m4tp9" Apr 21 00:11:59.897845 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:11:59.895606 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-m4tp9" podStartSLOduration=1.45896637 
podStartE2EDuration="3.895591578s" podCreationTimestamp="2026-04-21 00:11:56 +0000 UTC" firstStartedPulling="2026-04-21 00:11:56.706003384 +0000 UTC m=+538.319374887" lastFinishedPulling="2026-04-21 00:11:59.142628577 +0000 UTC m=+540.756000095" observedRunningTime="2026-04-21 00:11:59.8929737 +0000 UTC m=+541.506345224" watchObservedRunningTime="2026-04-21 00:11:59.895591578 +0000 UTC m=+541.508963100" Apr 21 00:12:10.883238 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:12:10.883202 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-m4tp9" Apr 21 00:12:58.911076 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:12:58.911042 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log" Apr 21 00:12:58.911935 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:12:58.911915 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log" Apr 21 00:13:26.110010 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:26.109935 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-79db9458b-ftqqv"] Apr 21 00:13:26.112306 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:26.112286 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-79db9458b-ftqqv" Apr 21 00:13:26.114841 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:26.114820 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-jrw5v\"" Apr 21 00:13:26.122254 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:26.122234 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-79db9458b-ftqqv"] Apr 21 00:13:26.215516 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:26.215485 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkfgm\" (UniqueName: \"kubernetes.io/projected/34131584-358e-4eae-a8f6-f81012b8f60e-kube-api-access-nkfgm\") pod \"maas-controller-79db9458b-ftqqv\" (UID: \"34131584-358e-4eae-a8f6-f81012b8f60e\") " pod="opendatahub/maas-controller-79db9458b-ftqqv" Apr 21 00:13:26.316167 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:26.316130 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkfgm\" (UniqueName: \"kubernetes.io/projected/34131584-358e-4eae-a8f6-f81012b8f60e-kube-api-access-nkfgm\") pod \"maas-controller-79db9458b-ftqqv\" (UID: \"34131584-358e-4eae-a8f6-f81012b8f60e\") " pod="opendatahub/maas-controller-79db9458b-ftqqv" Apr 21 00:13:26.323892 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:26.323861 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkfgm\" (UniqueName: \"kubernetes.io/projected/34131584-358e-4eae-a8f6-f81012b8f60e-kube-api-access-nkfgm\") pod \"maas-controller-79db9458b-ftqqv\" (UID: \"34131584-358e-4eae-a8f6-f81012b8f60e\") " pod="opendatahub/maas-controller-79db9458b-ftqqv" Apr 21 00:13:26.422822 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:26.422747 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-79db9458b-ftqqv" Apr 21 00:13:26.541425 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:26.541313 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-79db9458b-ftqqv"] Apr 21 00:13:26.543994 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:13:26.543958 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34131584_358e_4eae_a8f6_f81012b8f60e.slice/crio-4ea8fea6455c31f69cb296403b9bc2f9d9697dc1c1282fbce31d0137651b52e1 WatchSource:0}: Error finding container 4ea8fea6455c31f69cb296403b9bc2f9d9697dc1c1282fbce31d0137651b52e1: Status 404 returned error can't find the container with id 4ea8fea6455c31f69cb296403b9bc2f9d9697dc1c1282fbce31d0137651b52e1 Apr 21 00:13:27.173999 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:27.173955 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-79db9458b-ftqqv" event={"ID":"34131584-358e-4eae-a8f6-f81012b8f60e","Type":"ContainerStarted","Data":"4ea8fea6455c31f69cb296403b9bc2f9d9697dc1c1282fbce31d0137651b52e1"} Apr 21 00:13:29.183544 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:29.183461 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-79db9458b-ftqqv" event={"ID":"34131584-358e-4eae-a8f6-f81012b8f60e","Type":"ContainerStarted","Data":"84a774dbd3e55e623857c9d3fb237ba57cdc0c6664aeb7cb07a506ce1f9ffcf0"} Apr 21 00:13:29.183897 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:29.183581 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-79db9458b-ftqqv" Apr 21 00:13:29.200158 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:29.200112 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-79db9458b-ftqqv" podStartSLOduration=0.837965705 podStartE2EDuration="3.200086094s" 
podCreationTimestamp="2026-04-21 00:13:26 +0000 UTC" firstStartedPulling="2026-04-21 00:13:26.547620971 +0000 UTC m=+628.160992472" lastFinishedPulling="2026-04-21 00:13:28.909741363 +0000 UTC m=+630.523112861" observedRunningTime="2026-04-21 00:13:29.19917287 +0000 UTC m=+630.812544391" watchObservedRunningTime="2026-04-21 00:13:29.200086094 +0000 UTC m=+630.813457616" Apr 21 00:13:31.359669 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:31.359634 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:13:40.193464 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:40.193428 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-79db9458b-ftqqv" Apr 21 00:13:40.531552 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:40.531521 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-779779f6b6-kl45c"] Apr 21 00:13:40.535066 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:40.535050 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-779779f6b6-kl45c" Apr 21 00:13:40.541337 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:40.541216 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-779779f6b6-kl45c"] Apr 21 00:13:40.631846 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:40.631810 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhv8t\" (UniqueName: \"kubernetes.io/projected/8e5300b1-3773-4c9e-805f-23d4dc12e755-kube-api-access-fhv8t\") pod \"maas-controller-779779f6b6-kl45c\" (UID: \"8e5300b1-3773-4c9e-805f-23d4dc12e755\") " pod="opendatahub/maas-controller-779779f6b6-kl45c" Apr 21 00:13:40.732476 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:40.732445 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhv8t\" (UniqueName: \"kubernetes.io/projected/8e5300b1-3773-4c9e-805f-23d4dc12e755-kube-api-access-fhv8t\") pod \"maas-controller-779779f6b6-kl45c\" (UID: \"8e5300b1-3773-4c9e-805f-23d4dc12e755\") " pod="opendatahub/maas-controller-779779f6b6-kl45c" Apr 21 00:13:40.740135 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:40.740088 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhv8t\" (UniqueName: \"kubernetes.io/projected/8e5300b1-3773-4c9e-805f-23d4dc12e755-kube-api-access-fhv8t\") pod \"maas-controller-779779f6b6-kl45c\" (UID: \"8e5300b1-3773-4c9e-805f-23d4dc12e755\") " pod="opendatahub/maas-controller-779779f6b6-kl45c" Apr 21 00:13:40.847825 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:40.847756 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-779779f6b6-kl45c"
Apr 21 00:13:40.966022 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:40.965991 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-779779f6b6-kl45c"]
Apr 21 00:13:40.968592 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:13:40.968560 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e5300b1_3773_4c9e_805f_23d4dc12e755.slice/crio-7b121b46eb0134e786e0a0772d8f6cb376f132eb9c7face30bf0a1a6d4653914 WatchSource:0}: Error finding container 7b121b46eb0134e786e0a0772d8f6cb376f132eb9c7face30bf0a1a6d4653914: Status 404 returned error can't find the container with id 7b121b46eb0134e786e0a0772d8f6cb376f132eb9c7face30bf0a1a6d4653914
Apr 21 00:13:41.226774 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:41.226686 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-779779f6b6-kl45c" event={"ID":"8e5300b1-3773-4c9e-805f-23d4dc12e755","Type":"ContainerStarted","Data":"7b121b46eb0134e786e0a0772d8f6cb376f132eb9c7face30bf0a1a6d4653914"}
Apr 21 00:13:42.232072 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:42.232031 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-779779f6b6-kl45c" event={"ID":"8e5300b1-3773-4c9e-805f-23d4dc12e755","Type":"ContainerStarted","Data":"410c9977b89fba27846cd6988a1601858d9b9da27926bc6c871cce3008d0a48b"}
Apr 21 00:13:42.232522 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:42.232150 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-779779f6b6-kl45c"
Apr 21 00:13:42.246394 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:42.246345 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-779779f6b6-kl45c" podStartSLOduration=1.864070704 podStartE2EDuration="2.246333918s" podCreationTimestamp="2026-04-21 00:13:40 +0000 UTC" firstStartedPulling="2026-04-21 00:13:40.969847008 +0000 UTC m=+642.583218508" lastFinishedPulling="2026-04-21 00:13:41.352110208 +0000 UTC m=+642.965481722" observedRunningTime="2026-04-21 00:13:42.245003883 +0000 UTC m=+643.858375409" watchObservedRunningTime="2026-04-21 00:13:42.246333918 +0000 UTC m=+643.859705439"
Apr 21 00:13:53.240593 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:53.240558 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-779779f6b6-kl45c"
Apr 21 00:13:53.279648 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:53.279614 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-79db9458b-ftqqv"]
Apr 21 00:13:53.279845 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:53.279824 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-79db9458b-ftqqv" podUID="34131584-358e-4eae-a8f6-f81012b8f60e" containerName="manager" containerID="cri-o://84a774dbd3e55e623857c9d3fb237ba57cdc0c6664aeb7cb07a506ce1f9ffcf0" gracePeriod=10
Apr 21 00:13:53.522209 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:53.522184 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-79db9458b-ftqqv"
Apr 21 00:13:53.624154 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:53.624123 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkfgm\" (UniqueName: \"kubernetes.io/projected/34131584-358e-4eae-a8f6-f81012b8f60e-kube-api-access-nkfgm\") pod \"34131584-358e-4eae-a8f6-f81012b8f60e\" (UID: \"34131584-358e-4eae-a8f6-f81012b8f60e\") "
Apr 21 00:13:53.626291 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:53.626262 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34131584-358e-4eae-a8f6-f81012b8f60e-kube-api-access-nkfgm" (OuterVolumeSpecName: "kube-api-access-nkfgm") pod "34131584-358e-4eae-a8f6-f81012b8f60e" (UID: "34131584-358e-4eae-a8f6-f81012b8f60e"). InnerVolumeSpecName "kube-api-access-nkfgm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 00:13:53.724943 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:53.724910 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nkfgm\" (UniqueName: \"kubernetes.io/projected/34131584-358e-4eae-a8f6-f81012b8f60e-kube-api-access-nkfgm\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\""
Apr 21 00:13:54.275941 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:54.275900 2571 generic.go:358] "Generic (PLEG): container finished" podID="34131584-358e-4eae-a8f6-f81012b8f60e" containerID="84a774dbd3e55e623857c9d3fb237ba57cdc0c6664aeb7cb07a506ce1f9ffcf0" exitCode=0
Apr 21 00:13:54.276371 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:54.275969 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-79db9458b-ftqqv" event={"ID":"34131584-358e-4eae-a8f6-f81012b8f60e","Type":"ContainerDied","Data":"84a774dbd3e55e623857c9d3fb237ba57cdc0c6664aeb7cb07a506ce1f9ffcf0"}
Apr 21 00:13:54.276371 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:54.275995 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-79db9458b-ftqqv"
Apr 21 00:13:54.276371 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:54.276003 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-79db9458b-ftqqv" event={"ID":"34131584-358e-4eae-a8f6-f81012b8f60e","Type":"ContainerDied","Data":"4ea8fea6455c31f69cb296403b9bc2f9d9697dc1c1282fbce31d0137651b52e1"}
Apr 21 00:13:54.276371 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:54.276023 2571 scope.go:117] "RemoveContainer" containerID="84a774dbd3e55e623857c9d3fb237ba57cdc0c6664aeb7cb07a506ce1f9ffcf0"
Apr 21 00:13:54.284460 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:54.284271 2571 scope.go:117] "RemoveContainer" containerID="84a774dbd3e55e623857c9d3fb237ba57cdc0c6664aeb7cb07a506ce1f9ffcf0"
Apr 21 00:13:54.284530 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:13:54.284513 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84a774dbd3e55e623857c9d3fb237ba57cdc0c6664aeb7cb07a506ce1f9ffcf0\": container with ID starting with 84a774dbd3e55e623857c9d3fb237ba57cdc0c6664aeb7cb07a506ce1f9ffcf0 not found: ID does not exist" containerID="84a774dbd3e55e623857c9d3fb237ba57cdc0c6664aeb7cb07a506ce1f9ffcf0"
Apr 21 00:13:54.284573 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:54.284538 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84a774dbd3e55e623857c9d3fb237ba57cdc0c6664aeb7cb07a506ce1f9ffcf0"} err="failed to get container status \"84a774dbd3e55e623857c9d3fb237ba57cdc0c6664aeb7cb07a506ce1f9ffcf0\": rpc error: code = NotFound desc = could not find container \"84a774dbd3e55e623857c9d3fb237ba57cdc0c6664aeb7cb07a506ce1f9ffcf0\": container with ID starting with 84a774dbd3e55e623857c9d3fb237ba57cdc0c6664aeb7cb07a506ce1f9ffcf0 not found: ID does not exist"
Apr 21 00:13:54.297170 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:54.297142 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-79db9458b-ftqqv"]
Apr 21 00:13:54.298558 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:54.298537 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-79db9458b-ftqqv"]
Apr 21 00:13:54.990479 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:13:54.990449 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34131584-358e-4eae-a8f6-f81012b8f60e" path="/var/lib/kubelet/pods/34131584-358e-4eae-a8f6-f81012b8f60e/volumes"
Apr 21 00:14:48.066627 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.066537 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:14:48.189476 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.189439 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"]
Apr 21 00:14:48.189794 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.189782 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34131584-358e-4eae-a8f6-f81012b8f60e" containerName="manager"
Apr 21 00:14:48.189838 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.189797 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="34131584-358e-4eae-a8f6-f81012b8f60e" containerName="manager"
Apr 21 00:14:48.189872 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.189865 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="34131584-358e-4eae-a8f6-f81012b8f60e" containerName="manager"
Apr 21 00:14:48.191879 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.191862 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.195554 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.195532 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-lw7xf\""
Apr 21 00:14:48.195743 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.195597 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 21 00:14:48.195871 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.195600 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\""
Apr 21 00:14:48.195871 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.195646 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 21 00:14:48.200357 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.200335 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"]
Apr 21 00:14:48.297071 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.297035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.297235 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.297080 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.297235 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.297198 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.297235 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.297231 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.297336 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.297262 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.297336 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.297315 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jjxb\" (UniqueName: \"kubernetes.io/projected/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-kube-api-access-2jjxb\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.398769 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.398683 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.398769 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.398741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.398769 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.398763 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.399033 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.398791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.399033 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.398814 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jjxb\" (UniqueName: \"kubernetes.io/projected/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-kube-api-access-2jjxb\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.399033 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.398843 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.399219 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.399175 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.399298 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.399260 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.399408 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.399382 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.401137 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.401110 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.401242 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.401222 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.405844 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.405822 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jjxb\" (UniqueName: \"kubernetes.io/projected/b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b-kube-api-access-2jjxb\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f\" (UID: \"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.503458 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.503413 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:14:48.633221 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.633189 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"]
Apr 21 00:14:48.634895 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:14:48.634865 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb96b337d_4ff4_4a8b_8bba_1c7fb5534d5b.slice/crio-3dd7b331452d000ed8395e302ec7a930f36a8792e09ad07cb0b7cf32a0f913d4 WatchSource:0}: Error finding container 3dd7b331452d000ed8395e302ec7a930f36a8792e09ad07cb0b7cf32a0f913d4: Status 404 returned error can't find the container with id 3dd7b331452d000ed8395e302ec7a930f36a8792e09ad07cb0b7cf32a0f913d4
Apr 21 00:14:48.636689 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:48.636671 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 00:14:49.103871 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:49.101276 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:14:49.476555 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:49.476469 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f" event={"ID":"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b","Type":"ContainerStarted","Data":"3dd7b331452d000ed8395e302ec7a930f36a8792e09ad07cb0b7cf32a0f913d4"}
Apr 21 00:14:54.701708 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:54.701666 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:14:54.983184 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:54.983155 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"]
Apr 21 00:14:54.985901 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:54.985880 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:54.988749 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:54.988714 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 21 00:14:54.995811 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:54.995762 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"]
Apr 21 00:14:55.066187 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.066140 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/408601c5-4214-43f1-b7ef-7271b1f9bd67-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.066187 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.066178 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/408601c5-4214-43f1-b7ef-7271b1f9bd67-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.066428 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.066222 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/408601c5-4214-43f1-b7ef-7271b1f9bd67-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.066428 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.066314 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/408601c5-4214-43f1-b7ef-7271b1f9bd67-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.066428 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.066362 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwm9p\" (UniqueName: \"kubernetes.io/projected/408601c5-4214-43f1-b7ef-7271b1f9bd67-kube-api-access-nwm9p\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.066594 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.066431 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/408601c5-4214-43f1-b7ef-7271b1f9bd67-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.167139 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.167068 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/408601c5-4214-43f1-b7ef-7271b1f9bd67-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.167316 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.167156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/408601c5-4214-43f1-b7ef-7271b1f9bd67-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.167316 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.167214 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/408601c5-4214-43f1-b7ef-7271b1f9bd67-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.167316 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.167266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/408601c5-4214-43f1-b7ef-7271b1f9bd67-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.167511 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.167343 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwm9p\" (UniqueName: \"kubernetes.io/projected/408601c5-4214-43f1-b7ef-7271b1f9bd67-kube-api-access-nwm9p\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.167511 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.167388 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/408601c5-4214-43f1-b7ef-7271b1f9bd67-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.167511 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.167400 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/408601c5-4214-43f1-b7ef-7271b1f9bd67-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.167731 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.167707 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/408601c5-4214-43f1-b7ef-7271b1f9bd67-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.167903 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.167883 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/408601c5-4214-43f1-b7ef-7271b1f9bd67-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.170498 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.170473 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/408601c5-4214-43f1-b7ef-7271b1f9bd67-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.170844 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.170826 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/408601c5-4214-43f1-b7ef-7271b1f9bd67-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.175976 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.175924 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwm9p\" (UniqueName: \"kubernetes.io/projected/408601c5-4214-43f1-b7ef-7271b1f9bd67-kube-api-access-nwm9p\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5\" (UID: \"408601c5-4214-43f1-b7ef-7271b1f9bd67\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.303111 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.303054 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:14:55.454230 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.454173 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"]
Apr 21 00:14:55.458265 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:14:55.458224 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod408601c5_4214_43f1_b7ef_7271b1f9bd67.slice/crio-a4e134c811737ffbc08f4e848e3807512784f0b85fa972edb8dba8442db95072 WatchSource:0}: Error finding container a4e134c811737ffbc08f4e848e3807512784f0b85fa972edb8dba8442db95072: Status 404 returned error can't find the container with id a4e134c811737ffbc08f4e848e3807512784f0b85fa972edb8dba8442db95072
Apr 21 00:14:55.501565 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.501533 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5" event={"ID":"408601c5-4214-43f1-b7ef-7271b1f9bd67","Type":"ContainerStarted","Data":"a4e134c811737ffbc08f4e848e3807512784f0b85fa972edb8dba8442db95072"}
Apr 21 00:14:55.503015 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:55.502967 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f" event={"ID":"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b","Type":"ContainerStarted","Data":"7ecf51b104f1620e9d0007e40534f774f0a2ce8251ac0b973725ae73eb5e1d8d"}
Apr 21 00:14:56.508218 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:56.508172 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5" event={"ID":"408601c5-4214-43f1-b7ef-7271b1f9bd67","Type":"ContainerStarted","Data":"c3e23394fa7de3f8bfb0f6b415c10b46be72c13c58cdb3a38775e9cef1f9d91c"}
Apr 21 00:14:59.698276 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:14:59.698240 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:15:00.127571 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:00.127527 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29612175-wnwlr"]
Apr 21 00:15:00.131176 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:00.131156 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr"
Apr 21 00:15:00.133511 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:00.133485 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-77wmh\""
Apr 21 00:15:00.139613 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:00.138533 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612175-wnwlr"]
Apr 21 00:15:00.210080 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:00.210020 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2pfw\" (UniqueName: \"kubernetes.io/projected/a21d3014-2d32-4ecf-a616-8231deaf331e-kube-api-access-q2pfw\") pod \"maas-api-key-cleanup-29612175-wnwlr\" (UID: \"a21d3014-2d32-4ecf-a616-8231deaf331e\") " pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr"
Apr 21 00:15:00.311361 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:00.311317 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2pfw\" (UniqueName: \"kubernetes.io/projected/a21d3014-2d32-4ecf-a616-8231deaf331e-kube-api-access-q2pfw\") pod \"maas-api-key-cleanup-29612175-wnwlr\" (UID: \"a21d3014-2d32-4ecf-a616-8231deaf331e\") " pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr"
Apr 21 00:15:00.319029 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:00.319000 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2pfw\" (UniqueName: \"kubernetes.io/projected/a21d3014-2d32-4ecf-a616-8231deaf331e-kube-api-access-q2pfw\") pod \"maas-api-key-cleanup-29612175-wnwlr\" (UID: \"a21d3014-2d32-4ecf-a616-8231deaf331e\") " pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr"
Apr 21 00:15:00.459147 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:00.459034 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr"
Apr 21 00:15:00.525774 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:00.525733 2571 generic.go:358] "Generic (PLEG): container finished" podID="b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b" containerID="7ecf51b104f1620e9d0007e40534f774f0a2ce8251ac0b973725ae73eb5e1d8d" exitCode=0
Apr 21 00:15:00.525917 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:00.525794 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f" event={"ID":"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b","Type":"ContainerDied","Data":"7ecf51b104f1620e9d0007e40534f774f0a2ce8251ac0b973725ae73eb5e1d8d"}
Apr 21 00:15:00.598034 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:00.597857 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612175-wnwlr"]
Apr 21 00:15:00.601355 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:15:00.601329 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda21d3014_2d32_4ecf_a616_8231deaf331e.slice/crio-5535f43975c8d138403933257c0b7db53f152c58ae91ac59e07a3c3632d1c535 WatchSource:0}: Error finding container 5535f43975c8d138403933257c0b7db53f152c58ae91ac59e07a3c3632d1c535: Status 404 returned error can't find the container with id 5535f43975c8d138403933257c0b7db53f152c58ae91ac59e07a3c3632d1c535
Apr 21 00:15:01.530711 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:01.530674 2571 generic.go:358] "Generic (PLEG): container finished" podID="408601c5-4214-43f1-b7ef-7271b1f9bd67" containerID="c3e23394fa7de3f8bfb0f6b415c10b46be72c13c58cdb3a38775e9cef1f9d91c" exitCode=0
Apr 21 00:15:01.531226 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:01.530733 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5" event={"ID":"408601c5-4214-43f1-b7ef-7271b1f9bd67","Type":"ContainerDied","Data":"c3e23394fa7de3f8bfb0f6b415c10b46be72c13c58cdb3a38775e9cef1f9d91c"}
Apr 21 00:15:01.532244 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:01.532218 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr" event={"ID":"a21d3014-2d32-4ecf-a616-8231deaf331e","Type":"ContainerStarted","Data":"5535f43975c8d138403933257c0b7db53f152c58ae91ac59e07a3c3632d1c535"}
Apr 21 00:15:02.536949 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:02.536911 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5" event={"ID":"408601c5-4214-43f1-b7ef-7271b1f9bd67","Type":"ContainerStarted","Data":"f262ef03634fbc81f327f5fffef477a733fd5c79130df50bfddf9f37bb8909a6"}
Apr 21 00:15:02.537376 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:02.537141 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5"
Apr 21 00:15:02.538520 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:02.538493 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f" event={"ID":"b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b","Type":"ContainerStarted","Data":"f7acaf50cc16cccb9851b99402f3bf8095539f5baa1aca949c86769484aef007"}
Apr 21 00:15:02.538709 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:02.538694 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f"
Apr 21 00:15:02.539712 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:02.539691 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr" event={"ID":"a21d3014-2d32-4ecf-a616-8231deaf331e","Type":"ContainerStarted","Data":"f91ba67da45611ad7839eff5d92dfb793032173543e216f05d01cf61056522f0"}
Apr 21 00:15:02.554272 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:02.554229 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5" podStartSLOduration=8.352185838 podStartE2EDuration="8.554216027s" podCreationTimestamp="2026-04-21 00:14:54 +0000 UTC" firstStartedPulling="2026-04-21 00:15:01.531534303 +0000 UTC m=+723.144905802" lastFinishedPulling="2026-04-21 00:15:01.73356449 +0000 UTC m=+723.346935991" observedRunningTime="2026-04-21 00:15:02.55251737 +0000 UTC m=+724.165888892" watchObservedRunningTime="2026-04-21 00:15:02.554216027 +0000 UTC m=+724.167587541"
Apr 21 00:15:02.566230 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:02.566195 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr" podStartSLOduration=1.437036065 podStartE2EDuration="2.566183229s" podCreationTimestamp="2026-04-21 00:15:00 +0000 UTC" firstStartedPulling="2026-04-21 00:15:00.603300109 +0000 UTC m=+722.216671608" lastFinishedPulling="2026-04-21 00:15:01.732447269 +0000 UTC m=+723.345818772" observedRunningTime="2026-04-21 00:15:02.564499659 +0000 UTC m=+724.177871180" watchObservedRunningTime="2026-04-21 00:15:02.566183229 +0000 UTC m=+724.179554741"
Apr 21 00:15:02.580636 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:02.580596 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f" podStartSLOduration=1.484485592 podStartE2EDuration="14.58058485s" podCreationTimestamp="2026-04-21 00:14:48 +0000 UTC" firstStartedPulling="2026-04-21 00:14:48.636800763 +0000 UTC m=+710.250172262" lastFinishedPulling="2026-04-21 00:15:01.732900022 +0000 UTC m=+723.346271520" observedRunningTime="2026-04-21 00:15:02.579498742 +0000 UTC m=+724.192870263" watchObservedRunningTime="2026-04-21 00:15:02.58058485 +0000 UTC m=+724.193956370"
Apr 21 00:15:13.560666 ip-10-0-143-115 kubenswrapper[2571]: I0421 
00:15:13.560633 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f" Apr 21 00:15:13.561946 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:13.561923 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5" Apr 21 00:15:22.609742 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:22.609715 2571 generic.go:358] "Generic (PLEG): container finished" podID="a21d3014-2d32-4ecf-a616-8231deaf331e" containerID="f91ba67da45611ad7839eff5d92dfb793032173543e216f05d01cf61056522f0" exitCode=6 Apr 21 00:15:22.610148 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:22.609795 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr" event={"ID":"a21d3014-2d32-4ecf-a616-8231deaf331e","Type":"ContainerDied","Data":"f91ba67da45611ad7839eff5d92dfb793032173543e216f05d01cf61056522f0"} Apr 21 00:15:22.610232 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:22.610214 2571 scope.go:117] "RemoveContainer" containerID="f91ba67da45611ad7839eff5d92dfb793032173543e216f05d01cf61056522f0" Apr 21 00:15:23.614854 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:23.614819 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr" event={"ID":"a21d3014-2d32-4ecf-a616-8231deaf331e","Type":"ContainerStarted","Data":"6b1041a53d35d2efa1dba71a8a2d282856907deaecf325bb9c99dc9cd1dae604"} Apr 21 00:15:27.403865 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:27.403826 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:15:31.695798 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:31.695763 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:15:43.690578 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:43.690502 
2571 generic.go:358] "Generic (PLEG): container finished" podID="a21d3014-2d32-4ecf-a616-8231deaf331e" containerID="6b1041a53d35d2efa1dba71a8a2d282856907deaecf325bb9c99dc9cd1dae604" exitCode=6 Apr 21 00:15:43.690578 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:43.690553 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr" event={"ID":"a21d3014-2d32-4ecf-a616-8231deaf331e","Type":"ContainerDied","Data":"6b1041a53d35d2efa1dba71a8a2d282856907deaecf325bb9c99dc9cd1dae604"} Apr 21 00:15:43.691005 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:43.690586 2571 scope.go:117] "RemoveContainer" containerID="f91ba67da45611ad7839eff5d92dfb793032173543e216f05d01cf61056522f0" Apr 21 00:15:43.691005 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:43.690867 2571 scope.go:117] "RemoveContainer" containerID="6b1041a53d35d2efa1dba71a8a2d282856907deaecf325bb9c99dc9cd1dae604" Apr 21 00:15:43.691132 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:15:43.691071 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29612175-wnwlr_opendatahub(a21d3014-2d32-4ecf-a616-8231deaf331e)\"" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr" podUID="a21d3014-2d32-4ecf-a616-8231deaf331e" Apr 21 00:15:56.986139 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:56.986079 2571 scope.go:117] "RemoveContainer" containerID="6b1041a53d35d2efa1dba71a8a2d282856907deaecf325bb9c99dc9cd1dae604" Apr 21 00:15:57.743005 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:57.742966 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr" event={"ID":"a21d3014-2d32-4ecf-a616-8231deaf331e","Type":"ContainerStarted","Data":"a5292696ce745b1fac55efc1867bc8f3fcd86e3433367024b482ee9e08022663"} Apr 21 00:15:58.010222 ip-10-0-143-115 kubenswrapper[2571]: I0421 
00:15:58.010189 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612175-wnwlr"] Apr 21 00:15:58.746790 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:15:58.746751 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr" podUID="a21d3014-2d32-4ecf-a616-8231deaf331e" containerName="cleanup" containerID="cri-o://a5292696ce745b1fac55efc1867bc8f3fcd86e3433367024b482ee9e08022663" gracePeriod=30 Apr 21 00:16:17.770369 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:16:17.770331 2571 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda21d3014_2d32_4ecf_a616_8231deaf331e.slice/crio-5535f43975c8d138403933257c0b7db53f152c58ae91ac59e07a3c3632d1c535\": RecentStats: unable to find data in memory cache]" Apr 21 00:16:17.816350 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:17.816159 2571 generic.go:358] "Generic (PLEG): container finished" podID="a21d3014-2d32-4ecf-a616-8231deaf331e" containerID="a5292696ce745b1fac55efc1867bc8f3fcd86e3433367024b482ee9e08022663" exitCode=6 Apr 21 00:16:17.816350 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:17.816218 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr" event={"ID":"a21d3014-2d32-4ecf-a616-8231deaf331e","Type":"ContainerDied","Data":"a5292696ce745b1fac55efc1867bc8f3fcd86e3433367024b482ee9e08022663"} Apr 21 00:16:17.816350 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:17.816255 2571 scope.go:117] "RemoveContainer" containerID="6b1041a53d35d2efa1dba71a8a2d282856907deaecf325bb9c99dc9cd1dae604" Apr 21 00:16:17.893779 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:17.893756 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr" Apr 21 00:16:17.994383 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:17.994298 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2pfw\" (UniqueName: \"kubernetes.io/projected/a21d3014-2d32-4ecf-a616-8231deaf331e-kube-api-access-q2pfw\") pod \"a21d3014-2d32-4ecf-a616-8231deaf331e\" (UID: \"a21d3014-2d32-4ecf-a616-8231deaf331e\") " Apr 21 00:16:17.996267 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:17.996242 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21d3014-2d32-4ecf-a616-8231deaf331e-kube-api-access-q2pfw" (OuterVolumeSpecName: "kube-api-access-q2pfw") pod "a21d3014-2d32-4ecf-a616-8231deaf331e" (UID: "a21d3014-2d32-4ecf-a616-8231deaf331e"). InnerVolumeSpecName "kube-api-access-q2pfw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 00:16:18.095918 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:18.095885 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q2pfw\" (UniqueName: \"kubernetes.io/projected/a21d3014-2d32-4ecf-a616-8231deaf331e-kube-api-access-q2pfw\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:16:18.820874 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:18.820834 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr" event={"ID":"a21d3014-2d32-4ecf-a616-8231deaf331e","Type":"ContainerDied","Data":"5535f43975c8d138403933257c0b7db53f152c58ae91ac59e07a3c3632d1c535"} Apr 21 00:16:18.821310 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:18.820888 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612175-wnwlr" Apr 21 00:16:18.821310 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:18.820888 2571 scope.go:117] "RemoveContainer" containerID="a5292696ce745b1fac55efc1867bc8f3fcd86e3433367024b482ee9e08022663" Apr 21 00:16:18.841194 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:18.841163 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612175-wnwlr"] Apr 21 00:16:18.842837 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:18.842814 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612175-wnwlr"] Apr 21 00:16:18.991021 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:18.990992 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21d3014-2d32-4ecf-a616-8231deaf331e" path="/var/lib/kubelet/pods/a21d3014-2d32-4ecf-a616-8231deaf331e/volumes" Apr 21 00:16:35.901348 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:35.901308 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:16:40.796546 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:40.796510 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:16:47.294664 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:47.294621 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:16:57.897260 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:16:57.897223 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:17:06.098395 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:06.098354 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:17:16.298247 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:16.298215 
2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:17:26.011108 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:26.011067 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:17:36.695598 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:36.695566 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:17:42.971391 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:42.971360 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-779779f6b6-kl45c"] Apr 21 00:17:42.971779 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:42.971577 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-779779f6b6-kl45c" podUID="8e5300b1-3773-4c9e-805f-23d4dc12e755" containerName="manager" containerID="cri-o://410c9977b89fba27846cd6988a1601858d9b9da27926bc6c871cce3008d0a48b" gracePeriod=10 Apr 21 00:17:43.111230 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:43.111190 2571 generic.go:358] "Generic (PLEG): container finished" podID="8e5300b1-3773-4c9e-805f-23d4dc12e755" containerID="410c9977b89fba27846cd6988a1601858d9b9da27926bc6c871cce3008d0a48b" exitCode=0 Apr 21 00:17:43.111375 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:43.111264 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-779779f6b6-kl45c" event={"ID":"8e5300b1-3773-4c9e-805f-23d4dc12e755","Type":"ContainerDied","Data":"410c9977b89fba27846cd6988a1601858d9b9da27926bc6c871cce3008d0a48b"} Apr 21 00:17:43.218726 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:43.218701 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-779779f6b6-kl45c" Apr 21 00:17:43.356846 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:43.356809 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhv8t\" (UniqueName: \"kubernetes.io/projected/8e5300b1-3773-4c9e-805f-23d4dc12e755-kube-api-access-fhv8t\") pod \"8e5300b1-3773-4c9e-805f-23d4dc12e755\" (UID: \"8e5300b1-3773-4c9e-805f-23d4dc12e755\") " Apr 21 00:17:43.358891 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:43.358860 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5300b1-3773-4c9e-805f-23d4dc12e755-kube-api-access-fhv8t" (OuterVolumeSpecName: "kube-api-access-fhv8t") pod "8e5300b1-3773-4c9e-805f-23d4dc12e755" (UID: "8e5300b1-3773-4c9e-805f-23d4dc12e755"). InnerVolumeSpecName "kube-api-access-fhv8t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 00:17:43.458363 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:43.458328 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fhv8t\" (UniqueName: \"kubernetes.io/projected/8e5300b1-3773-4c9e-805f-23d4dc12e755-kube-api-access-fhv8t\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:17:44.115861 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.115827 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-779779f6b6-kl45c" event={"ID":"8e5300b1-3773-4c9e-805f-23d4dc12e755","Type":"ContainerDied","Data":"7b121b46eb0134e786e0a0772d8f6cb376f132eb9c7face30bf0a1a6d4653914"} Apr 21 00:17:44.116252 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.115866 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-779779f6b6-kl45c" Apr 21 00:17:44.116252 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.115869 2571 scope.go:117] "RemoveContainer" containerID="410c9977b89fba27846cd6988a1601858d9b9da27926bc6c871cce3008d0a48b" Apr 21 00:17:44.136131 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.136104 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-779779f6b6-kl45c"] Apr 21 00:17:44.139354 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.139335 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-779779f6b6-kl45c"] Apr 21 00:17:44.857395 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.857359 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-779779f6b6-nkvzw"] Apr 21 00:17:44.857703 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.857692 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a21d3014-2d32-4ecf-a616-8231deaf331e" containerName="cleanup" Apr 21 00:17:44.857747 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.857705 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21d3014-2d32-4ecf-a616-8231deaf331e" containerName="cleanup" Apr 21 00:17:44.857747 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.857714 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e5300b1-3773-4c9e-805f-23d4dc12e755" containerName="manager" Apr 21 00:17:44.857747 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.857720 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5300b1-3773-4c9e-805f-23d4dc12e755" containerName="manager" Apr 21 00:17:44.857747 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.857738 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a21d3014-2d32-4ecf-a616-8231deaf331e" containerName="cleanup" Apr 21 00:17:44.857747 ip-10-0-143-115 kubenswrapper[2571]: I0421 
00:17:44.857744 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21d3014-2d32-4ecf-a616-8231deaf331e" containerName="cleanup" Apr 21 00:17:44.857893 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.857755 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a21d3014-2d32-4ecf-a616-8231deaf331e" containerName="cleanup" Apr 21 00:17:44.857893 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.857760 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21d3014-2d32-4ecf-a616-8231deaf331e" containerName="cleanup" Apr 21 00:17:44.857893 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.857821 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e5300b1-3773-4c9e-805f-23d4dc12e755" containerName="manager" Apr 21 00:17:44.857893 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.857831 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a21d3014-2d32-4ecf-a616-8231deaf331e" containerName="cleanup" Apr 21 00:17:44.857893 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.857839 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a21d3014-2d32-4ecf-a616-8231deaf331e" containerName="cleanup" Apr 21 00:17:44.860518 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.860499 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-779779f6b6-nkvzw" Apr 21 00:17:44.862835 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.862814 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-jrw5v\"" Apr 21 00:17:44.867985 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.867964 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-779779f6b6-nkvzw"] Apr 21 00:17:44.971864 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.971834 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlc6v\" (UniqueName: \"kubernetes.io/projected/97ff5bbb-4b30-46d1-b6f4-3319bbdf0757-kube-api-access-rlc6v\") pod \"maas-controller-779779f6b6-nkvzw\" (UID: \"97ff5bbb-4b30-46d1-b6f4-3319bbdf0757\") " pod="opendatahub/maas-controller-779779f6b6-nkvzw" Apr 21 00:17:44.993436 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:44.993406 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e5300b1-3773-4c9e-805f-23d4dc12e755" path="/var/lib/kubelet/pods/8e5300b1-3773-4c9e-805f-23d4dc12e755/volumes" Apr 21 00:17:45.072993 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:45.072956 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlc6v\" (UniqueName: \"kubernetes.io/projected/97ff5bbb-4b30-46d1-b6f4-3319bbdf0757-kube-api-access-rlc6v\") pod \"maas-controller-779779f6b6-nkvzw\" (UID: \"97ff5bbb-4b30-46d1-b6f4-3319bbdf0757\") " pod="opendatahub/maas-controller-779779f6b6-nkvzw" Apr 21 00:17:45.080901 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:45.080869 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlc6v\" (UniqueName: \"kubernetes.io/projected/97ff5bbb-4b30-46d1-b6f4-3319bbdf0757-kube-api-access-rlc6v\") pod \"maas-controller-779779f6b6-nkvzw\" (UID: 
\"97ff5bbb-4b30-46d1-b6f4-3319bbdf0757\") " pod="opendatahub/maas-controller-779779f6b6-nkvzw" Apr 21 00:17:45.172159 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:45.172058 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-779779f6b6-nkvzw" Apr 21 00:17:45.294946 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:45.294914 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-779779f6b6-nkvzw"] Apr 21 00:17:45.297817 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:17:45.297791 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97ff5bbb_4b30_46d1_b6f4_3319bbdf0757.slice/crio-320a9e8c3c04f1e905ad3c2929f9f0eaa132426d58328472edc172ea59580fa0 WatchSource:0}: Error finding container 320a9e8c3c04f1e905ad3c2929f9f0eaa132426d58328472edc172ea59580fa0: Status 404 returned error can't find the container with id 320a9e8c3c04f1e905ad3c2929f9f0eaa132426d58328472edc172ea59580fa0 Apr 21 00:17:46.126192 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:46.126079 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-779779f6b6-nkvzw" event={"ID":"97ff5bbb-4b30-46d1-b6f4-3319bbdf0757","Type":"ContainerStarted","Data":"76bcf34284f8ca652b2c62d4c32c27748adb6295e855867517df4dd292497b32"} Apr 21 00:17:46.126192 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:46.126146 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-779779f6b6-nkvzw" event={"ID":"97ff5bbb-4b30-46d1-b6f4-3319bbdf0757","Type":"ContainerStarted","Data":"320a9e8c3c04f1e905ad3c2929f9f0eaa132426d58328472edc172ea59580fa0"} Apr 21 00:17:46.126401 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:46.126232 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-779779f6b6-nkvzw" Apr 21 00:17:46.142608 ip-10-0-143-115 
kubenswrapper[2571]: I0421 00:17:46.142549 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-779779f6b6-nkvzw" podStartSLOduration=1.65964349 podStartE2EDuration="2.142531797s" podCreationTimestamp="2026-04-21 00:17:44 +0000 UTC" firstStartedPulling="2026-04-21 00:17:45.299443692 +0000 UTC m=+886.912815190" lastFinishedPulling="2026-04-21 00:17:45.782331998 +0000 UTC m=+887.395703497" observedRunningTime="2026-04-21 00:17:46.140446359 +0000 UTC m=+887.753817905" watchObservedRunningTime="2026-04-21 00:17:46.142531797 +0000 UTC m=+887.755903319" Apr 21 00:17:57.134723 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:57.134690 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-779779f6b6-nkvzw" Apr 21 00:17:58.935549 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:58.935518 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log" Apr 21 00:17:58.936643 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:17:58.936619 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log" Apr 21 00:18:37.894305 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:18:37.894265 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:18:53.402026 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:18:53.401989 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:19:32.199020 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:19:32.198942 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:19:49.308293 ip-10-0-143-115 kubenswrapper[2571]: 
I0421 00:19:49.308261 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:20:03.903416 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:20:03.903379 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:20:19.097133 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:20:19.097083 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:21:13.493715 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:21:13.493676 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:21:22.201855 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:21:22.201822 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:21:38.991724 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:21:38.991691 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:21:47.491967 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:21:47.491934 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:22:05.701945 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:22:05.701901 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:22:13.196487 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:22:13.196447 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:22:46.693404 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:22:46.693321 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:22:54.896559 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:22:54.896516 2571 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:22:58.960707 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:22:58.960679 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log" Apr 21 00:22:58.963646 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:22:58.963621 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log" Apr 21 00:23:03.495310 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:23:03.495269 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:23:12.498426 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:23:12.498385 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:23:20.798899 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:23:20.798862 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:23:36.995966 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:23:36.995933 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:23:48.295070 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:23:48.294989 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:24:34.703247 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:24:34.703206 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:24:42.894675 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:24:42.894639 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:24:52.601134 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:24:52.601083 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:25:00.300962 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:25:00.300915 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:25:09.396562 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:25:09.396526 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:25:18.497542 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:25:18.497451 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:25:27.900107 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:25:27.900061 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:25:35.490532 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:25:35.490490 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:25:45.298884 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:25:45.298840 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:25:53.505170 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:25:53.505131 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:26:02.694505 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:26:02.694467 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:26:11.499533 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:26:11.499497 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:26:19.396788 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:26:19.396753 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:26:28.697907 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:26:28.697868 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:26:37.299679 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:26:37.299642 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:26:45.899059 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:26:45.899020 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:26:55.099379 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:26:55.099341 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:27:03.006732 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:03.006695 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:27:54.632163 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:54.632130 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"] Apr 21 00:27:54.632697 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:54.632385 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6" podUID="0c6bdde0-e761-41d5-9996-766786548757" containerName="manager" containerID="cri-o://2565faaec16ddd6b663ef0c0ab94c660d871889f426b03b16ae35310b8b92f6c" gracePeriod=10 Apr 21 00:27:54.995855 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:54.995832 2571 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6" Apr 21 00:27:55.022721 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.022693 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ptsk\" (UniqueName: \"kubernetes.io/projected/0c6bdde0-e761-41d5-9996-766786548757-kube-api-access-6ptsk\") pod \"0c6bdde0-e761-41d5-9996-766786548757\" (UID: \"0c6bdde0-e761-41d5-9996-766786548757\") " Apr 21 00:27:55.022871 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.022745 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c6bdde0-e761-41d5-9996-766786548757-extensions-socket-volume\") pod \"0c6bdde0-e761-41d5-9996-766786548757\" (UID: \"0c6bdde0-e761-41d5-9996-766786548757\") " Apr 21 00:27:55.023236 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.023204 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c6bdde0-e761-41d5-9996-766786548757-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "0c6bdde0-e761-41d5-9996-766786548757" (UID: "0c6bdde0-e761-41d5-9996-766786548757"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 00:27:55.025054 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.025016 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c6bdde0-e761-41d5-9996-766786548757-kube-api-access-6ptsk" (OuterVolumeSpecName: "kube-api-access-6ptsk") pod "0c6bdde0-e761-41d5-9996-766786548757" (UID: "0c6bdde0-e761-41d5-9996-766786548757"). InnerVolumeSpecName "kube-api-access-6ptsk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 00:27:55.123695 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.123666 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6ptsk\" (UniqueName: \"kubernetes.io/projected/0c6bdde0-e761-41d5-9996-766786548757-kube-api-access-6ptsk\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:27:55.123695 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.123692 2571 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c6bdde0-e761-41d5-9996-766786548757-extensions-socket-volume\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\"" Apr 21 00:27:55.278021 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.277989 2571 generic.go:358] "Generic (PLEG): container finished" podID="0c6bdde0-e761-41d5-9996-766786548757" containerID="2565faaec16ddd6b663ef0c0ab94c660d871889f426b03b16ae35310b8b92f6c" exitCode=0 Apr 21 00:27:55.278223 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.278046 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6" Apr 21 00:27:55.278223 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.278077 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6" event={"ID":"0c6bdde0-e761-41d5-9996-766786548757","Type":"ContainerDied","Data":"2565faaec16ddd6b663ef0c0ab94c660d871889f426b03b16ae35310b8b92f6c"} Apr 21 00:27:55.278223 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.278141 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6" event={"ID":"0c6bdde0-e761-41d5-9996-766786548757","Type":"ContainerDied","Data":"d4c7413ac3afbdc511ff00447fe901737b491fa5a3f2cac2cc2061430f700422"} Apr 21 00:27:55.278223 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.278159 2571 scope.go:117] "RemoveContainer" containerID="2565faaec16ddd6b663ef0c0ab94c660d871889f426b03b16ae35310b8b92f6c" Apr 21 00:27:55.287314 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.287297 2571 scope.go:117] "RemoveContainer" containerID="2565faaec16ddd6b663ef0c0ab94c660d871889f426b03b16ae35310b8b92f6c" Apr 21 00:27:55.287552 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:27:55.287531 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2565faaec16ddd6b663ef0c0ab94c660d871889f426b03b16ae35310b8b92f6c\": container with ID starting with 2565faaec16ddd6b663ef0c0ab94c660d871889f426b03b16ae35310b8b92f6c not found: ID does not exist" containerID="2565faaec16ddd6b663ef0c0ab94c660d871889f426b03b16ae35310b8b92f6c" Apr 21 00:27:55.287630 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.287559 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2565faaec16ddd6b663ef0c0ab94c660d871889f426b03b16ae35310b8b92f6c"} err="failed to get container status 
\"2565faaec16ddd6b663ef0c0ab94c660d871889f426b03b16ae35310b8b92f6c\": rpc error: code = NotFound desc = could not find container \"2565faaec16ddd6b663ef0c0ab94c660d871889f426b03b16ae35310b8b92f6c\": container with ID starting with 2565faaec16ddd6b663ef0c0ab94c660d871889f426b03b16ae35310b8b92f6c not found: ID does not exist" Apr 21 00:27:55.299220 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.299192 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"] Apr 21 00:27:55.305049 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:55.305026 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4qrc6"] Apr 21 00:27:56.989744 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:56.989708 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c6bdde0-e761-41d5-9996-766786548757" path="/var/lib/kubelet/pods/0c6bdde0-e761-41d5-9996-766786548757/volumes" Apr 21 00:27:58.986503 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:58.986466 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log" Apr 21 00:27:58.990018 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:27:58.989995 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log" Apr 21 00:29:00.876223 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:00.876185 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk"] Apr 21 00:29:00.876623 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:00.876542 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c6bdde0-e761-41d5-9996-766786548757" containerName="manager" 
Apr 21 00:29:00.876623 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:00.876553 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6bdde0-e761-41d5-9996-766786548757" containerName="manager" Apr 21 00:29:00.876692 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:00.876625 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c6bdde0-e761-41d5-9996-766786548757" containerName="manager" Apr 21 00:29:00.876692 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:00.876633 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a21d3014-2d32-4ecf-a616-8231deaf331e" containerName="cleanup" Apr 21 00:29:00.879526 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:00.879505 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk" Apr 21 00:29:00.882235 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:00.882216 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-7v22d\"" Apr 21 00:29:00.892719 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:00.892689 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk"] Apr 21 00:29:00.971783 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:00.971747 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/af5714cc-ec6c-4481-86ca-81efb87a2760-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-knwsk\" (UID: \"af5714cc-ec6c-4481-86ca-81efb87a2760\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk" Apr 21 00:29:00.971971 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:00.971815 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-hczln\" (UniqueName: \"kubernetes.io/projected/af5714cc-ec6c-4481-86ca-81efb87a2760-kube-api-access-hczln\") pod \"kuadrant-operator-controller-manager-55c7f4c975-knwsk\" (UID: \"af5714cc-ec6c-4481-86ca-81efb87a2760\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk" Apr 21 00:29:01.072450 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:01.072412 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hczln\" (UniqueName: \"kubernetes.io/projected/af5714cc-ec6c-4481-86ca-81efb87a2760-kube-api-access-hczln\") pod \"kuadrant-operator-controller-manager-55c7f4c975-knwsk\" (UID: \"af5714cc-ec6c-4481-86ca-81efb87a2760\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk" Apr 21 00:29:01.072618 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:01.072503 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/af5714cc-ec6c-4481-86ca-81efb87a2760-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-knwsk\" (UID: \"af5714cc-ec6c-4481-86ca-81efb87a2760\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk" Apr 21 00:29:01.072898 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:01.072870 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/af5714cc-ec6c-4481-86ca-81efb87a2760-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-knwsk\" (UID: \"af5714cc-ec6c-4481-86ca-81efb87a2760\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk" Apr 21 00:29:01.087610 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:01.087580 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hczln\" (UniqueName: 
\"kubernetes.io/projected/af5714cc-ec6c-4481-86ca-81efb87a2760-kube-api-access-hczln\") pod \"kuadrant-operator-controller-manager-55c7f4c975-knwsk\" (UID: \"af5714cc-ec6c-4481-86ca-81efb87a2760\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk" Apr 21 00:29:01.190325 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:01.190228 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk" Apr 21 00:29:01.346356 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:01.346321 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk"] Apr 21 00:29:01.349380 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:29:01.349348 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf5714cc_ec6c_4481_86ca_81efb87a2760.slice/crio-746bec0432ce58c180ac82209e87d3d11ab3269a5a60c186486e9677c4b1af23 WatchSource:0}: Error finding container 746bec0432ce58c180ac82209e87d3d11ab3269a5a60c186486e9677c4b1af23: Status 404 returned error can't find the container with id 746bec0432ce58c180ac82209e87d3d11ab3269a5a60c186486e9677c4b1af23 Apr 21 00:29:01.352302 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:01.352285 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 00:29:01.515565 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:01.515522 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk" event={"ID":"af5714cc-ec6c-4481-86ca-81efb87a2760","Type":"ContainerStarted","Data":"7899f5b7ce4680c3facdc45bed90e9dd483b404875c58f78204f5cb24417729f"} Apr 21 00:29:01.515565 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:01.515567 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk" event={"ID":"af5714cc-ec6c-4481-86ca-81efb87a2760","Type":"ContainerStarted","Data":"746bec0432ce58c180ac82209e87d3d11ab3269a5a60c186486e9677c4b1af23"} Apr 21 00:29:01.515878 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:01.515600 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk" Apr 21 00:29:01.537796 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:01.537629 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk" podStartSLOduration=1.537605299 podStartE2EDuration="1.537605299s" podCreationTimestamp="2026-04-21 00:29:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 00:29:01.536510471 +0000 UTC m=+1563.149881991" watchObservedRunningTime="2026-04-21 00:29:01.537605299 +0000 UTC m=+1563.150976821" Apr 21 00:29:12.520939 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:12.520900 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-knwsk" Apr 21 00:29:20.906584 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:20.906545 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:29:25.896109 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:25.896058 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:29:51.497169 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:51.497134 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:29:57.995959 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:29:57.995923 2571 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:30:00.131350 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:00.131313 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29612190-99vtw"] Apr 21 00:30:00.134461 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:00.134437 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" Apr 21 00:30:00.136774 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:00.136746 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-77wmh\"" Apr 21 00:30:00.141441 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:00.141405 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612190-99vtw"] Apr 21 00:30:00.278395 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:00.278353 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpsqr\" (UniqueName: \"kubernetes.io/projected/d88444dc-37ae-4b12-9617-61db8b2334c2-kube-api-access-qpsqr\") pod \"maas-api-key-cleanup-29612190-99vtw\" (UID: \"d88444dc-37ae-4b12-9617-61db8b2334c2\") " pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" Apr 21 00:30:00.379311 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:00.379266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpsqr\" (UniqueName: \"kubernetes.io/projected/d88444dc-37ae-4b12-9617-61db8b2334c2-kube-api-access-qpsqr\") pod \"maas-api-key-cleanup-29612190-99vtw\" (UID: \"d88444dc-37ae-4b12-9617-61db8b2334c2\") " pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" Apr 21 00:30:00.387187 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:00.387088 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpsqr\" (UniqueName: 
\"kubernetes.io/projected/d88444dc-37ae-4b12-9617-61db8b2334c2-kube-api-access-qpsqr\") pod \"maas-api-key-cleanup-29612190-99vtw\" (UID: \"d88444dc-37ae-4b12-9617-61db8b2334c2\") " pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" Apr 21 00:30:00.448049 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:00.448002 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" Apr 21 00:30:00.577274 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:00.577241 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612190-99vtw"] Apr 21 00:30:00.580462 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:30:00.580429 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd88444dc_37ae_4b12_9617_61db8b2334c2.slice/crio-e6b3452aba5e9e784eb7f76a98b600ad9038c31ba5beb5cec5ba9e709b866aa5 WatchSource:0}: Error finding container e6b3452aba5e9e784eb7f76a98b600ad9038c31ba5beb5cec5ba9e709b866aa5: Status 404 returned error can't find the container with id e6b3452aba5e9e784eb7f76a98b600ad9038c31ba5beb5cec5ba9e709b866aa5 Apr 21 00:30:00.736129 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:00.736025 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" event={"ID":"d88444dc-37ae-4b12-9617-61db8b2334c2","Type":"ContainerStarted","Data":"e6b3452aba5e9e784eb7f76a98b600ad9038c31ba5beb5cec5ba9e709b866aa5"} Apr 21 00:30:01.740750 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:01.740706 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" event={"ID":"d88444dc-37ae-4b12-9617-61db8b2334c2","Type":"ContainerStarted","Data":"3a12b5f4caba3570eba22e132dc617e6892667129d9048e4b1370e0ffa7e98b7"} Apr 21 00:30:01.755304 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:01.755253 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" podStartSLOduration=1.755239362 podStartE2EDuration="1.755239362s" podCreationTimestamp="2026-04-21 00:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 00:30:01.754497524 +0000 UTC m=+1623.367869058" watchObservedRunningTime="2026-04-21 00:30:01.755239362 +0000 UTC m=+1623.368610924" Apr 21 00:30:07.137768 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:07.137734 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:30:18.001757 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:18.001720 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:30:21.812105 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:21.812071 2571 generic.go:358] "Generic (PLEG): container finished" podID="d88444dc-37ae-4b12-9617-61db8b2334c2" containerID="3a12b5f4caba3570eba22e132dc617e6892667129d9048e4b1370e0ffa7e98b7" exitCode=6 Apr 21 00:30:21.812475 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:21.812147 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" event={"ID":"d88444dc-37ae-4b12-9617-61db8b2334c2","Type":"ContainerDied","Data":"3a12b5f4caba3570eba22e132dc617e6892667129d9048e4b1370e0ffa7e98b7"} Apr 21 00:30:21.812518 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:21.812492 2571 scope.go:117] "RemoveContainer" containerID="3a12b5f4caba3570eba22e132dc617e6892667129d9048e4b1370e0ffa7e98b7" Apr 21 00:30:22.817010 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:22.816977 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" 
event={"ID":"d88444dc-37ae-4b12-9617-61db8b2334c2","Type":"ContainerStarted","Data":"edb47aacaeedf9f945c3acd0fb404c7f52f9c3d49563f1aec31af67a6c8859d8"} Apr 21 00:30:26.401708 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:26.401670 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:30:37.300145 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:37.300101 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:30:42.892592 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:42.892556 2571 generic.go:358] "Generic (PLEG): container finished" podID="d88444dc-37ae-4b12-9617-61db8b2334c2" containerID="edb47aacaeedf9f945c3acd0fb404c7f52f9c3d49563f1aec31af67a6c8859d8" exitCode=6 Apr 21 00:30:42.893029 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:42.892629 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" event={"ID":"d88444dc-37ae-4b12-9617-61db8b2334c2","Type":"ContainerDied","Data":"edb47aacaeedf9f945c3acd0fb404c7f52f9c3d49563f1aec31af67a6c8859d8"} Apr 21 00:30:42.893029 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:42.892670 2571 scope.go:117] "RemoveContainer" containerID="3a12b5f4caba3570eba22e132dc617e6892667129d9048e4b1370e0ffa7e98b7" Apr 21 00:30:42.893147 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:42.893036 2571 scope.go:117] "RemoveContainer" containerID="edb47aacaeedf9f945c3acd0fb404c7f52f9c3d49563f1aec31af67a6c8859d8" Apr 21 00:30:42.893294 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:30:42.893275 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29612190-99vtw_opendatahub(d88444dc-37ae-4b12-9617-61db8b2334c2)\"" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" 
podUID="d88444dc-37ae-4b12-9617-61db8b2334c2" Apr 21 00:30:46.497788 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:46.497748 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:30:55.986134 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:55.986075 2571 scope.go:117] "RemoveContainer" containerID="edb47aacaeedf9f945c3acd0fb404c7f52f9c3d49563f1aec31af67a6c8859d8" Apr 21 00:30:56.399582 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:56.399548 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:30:56.945770 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:56.945734 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" event={"ID":"d88444dc-37ae-4b12-9617-61db8b2334c2","Type":"ContainerStarted","Data":"0d841945eabd112840dad4ea586e0b99a34abf27cf17837a72af0ddff9fb999a"} Apr 21 00:30:57.011629 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:57.011594 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612190-99vtw"] Apr 21 00:30:57.950337 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:30:57.950281 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" podUID="d88444dc-37ae-4b12-9617-61db8b2334c2" containerName="cleanup" containerID="cri-o://0d841945eabd112840dad4ea586e0b99a34abf27cf17837a72af0ddff9fb999a" gracePeriod=30 Apr 21 00:31:05.903012 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:05.902972 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:31:15.506620 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:15.506580 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"] Apr 21 00:31:17.000719 ip-10-0-143-115 
kubenswrapper[2571]: I0421 00:31:17.000696 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" Apr 21 00:31:17.018280 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.018197 2571 generic.go:358] "Generic (PLEG): container finished" podID="d88444dc-37ae-4b12-9617-61db8b2334c2" containerID="0d841945eabd112840dad4ea586e0b99a34abf27cf17837a72af0ddff9fb999a" exitCode=6 Apr 21 00:31:17.018280 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.018259 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" Apr 21 00:31:17.018465 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.018265 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" event={"ID":"d88444dc-37ae-4b12-9617-61db8b2334c2","Type":"ContainerDied","Data":"0d841945eabd112840dad4ea586e0b99a34abf27cf17837a72af0ddff9fb999a"} Apr 21 00:31:17.018465 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.018383 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612190-99vtw" event={"ID":"d88444dc-37ae-4b12-9617-61db8b2334c2","Type":"ContainerDied","Data":"e6b3452aba5e9e784eb7f76a98b600ad9038c31ba5beb5cec5ba9e709b866aa5"} Apr 21 00:31:17.018465 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.018414 2571 scope.go:117] "RemoveContainer" containerID="0d841945eabd112840dad4ea586e0b99a34abf27cf17837a72af0ddff9fb999a" Apr 21 00:31:17.027811 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.027788 2571 scope.go:117] "RemoveContainer" containerID="edb47aacaeedf9f945c3acd0fb404c7f52f9c3d49563f1aec31af67a6c8859d8" Apr 21 00:31:17.035802 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.035784 2571 scope.go:117] "RemoveContainer" containerID="0d841945eabd112840dad4ea586e0b99a34abf27cf17837a72af0ddff9fb999a" Apr 21 00:31:17.036080 ip-10-0-143-115 
kubenswrapper[2571]: E0421 00:31:17.036060 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d841945eabd112840dad4ea586e0b99a34abf27cf17837a72af0ddff9fb999a\": container with ID starting with 0d841945eabd112840dad4ea586e0b99a34abf27cf17837a72af0ddff9fb999a not found: ID does not exist" containerID="0d841945eabd112840dad4ea586e0b99a34abf27cf17837a72af0ddff9fb999a" Apr 21 00:31:17.036186 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.036113 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d841945eabd112840dad4ea586e0b99a34abf27cf17837a72af0ddff9fb999a"} err="failed to get container status \"0d841945eabd112840dad4ea586e0b99a34abf27cf17837a72af0ddff9fb999a\": rpc error: code = NotFound desc = could not find container \"0d841945eabd112840dad4ea586e0b99a34abf27cf17837a72af0ddff9fb999a\": container with ID starting with 0d841945eabd112840dad4ea586e0b99a34abf27cf17837a72af0ddff9fb999a not found: ID does not exist" Apr 21 00:31:17.036186 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.036137 2571 scope.go:117] "RemoveContainer" containerID="edb47aacaeedf9f945c3acd0fb404c7f52f9c3d49563f1aec31af67a6c8859d8" Apr 21 00:31:17.036422 ip-10-0-143-115 kubenswrapper[2571]: E0421 00:31:17.036403 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb47aacaeedf9f945c3acd0fb404c7f52f9c3d49563f1aec31af67a6c8859d8\": container with ID starting with edb47aacaeedf9f945c3acd0fb404c7f52f9c3d49563f1aec31af67a6c8859d8 not found: ID does not exist" containerID="edb47aacaeedf9f945c3acd0fb404c7f52f9c3d49563f1aec31af67a6c8859d8" Apr 21 00:31:17.036518 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.036432 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb47aacaeedf9f945c3acd0fb404c7f52f9c3d49563f1aec31af67a6c8859d8"} 
err="failed to get container status \"edb47aacaeedf9f945c3acd0fb404c7f52f9c3d49563f1aec31af67a6c8859d8\": rpc error: code = NotFound desc = could not find container \"edb47aacaeedf9f945c3acd0fb404c7f52f9c3d49563f1aec31af67a6c8859d8\": container with ID starting with edb47aacaeedf9f945c3acd0fb404c7f52f9c3d49563f1aec31af67a6c8859d8 not found: ID does not exist" Apr 21 00:31:17.145210 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.145175 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpsqr\" (UniqueName: \"kubernetes.io/projected/d88444dc-37ae-4b12-9617-61db8b2334c2-kube-api-access-qpsqr\") pod \"d88444dc-37ae-4b12-9617-61db8b2334c2\" (UID: \"d88444dc-37ae-4b12-9617-61db8b2334c2\") " Apr 21 00:31:17.153054 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.153015 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d88444dc-37ae-4b12-9617-61db8b2334c2-kube-api-access-qpsqr" (OuterVolumeSpecName: "kube-api-access-qpsqr") pod "d88444dc-37ae-4b12-9617-61db8b2334c2" (UID: "d88444dc-37ae-4b12-9617-61db8b2334c2"). InnerVolumeSpecName "kube-api-access-qpsqr". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 00:31:17.248319 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.248283 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qpsqr\" (UniqueName: \"kubernetes.io/projected/d88444dc-37ae-4b12-9617-61db8b2334c2-kube-api-access-qpsqr\") on node \"ip-10-0-143-115.ec2.internal\" DevicePath \"\""
Apr 21 00:31:17.339971 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.339937 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612190-99vtw"]
Apr 21 00:31:17.343760 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:17.343732 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612190-99vtw"]
Apr 21 00:31:18.990606 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:18.990573 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d88444dc-37ae-4b12-9617-61db8b2334c2" path="/var/lib/kubelet/pods/d88444dc-37ae-4b12-9617-61db8b2334c2/volumes"
Apr 21 00:31:24.705941 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:24.705906 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:31:58.397220 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:31:58.397173 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:32:40.697842 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:32:40.697804 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:32:49.098181 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:32:49.098086 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:32:57.600966 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:32:57.600929 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:32:59.011404 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:32:59.011378 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log"
Apr 21 00:32:59.021539 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:32:59.021516 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log"
Apr 21 00:33:06.397210 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:33:06.397174 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:33:14.900776 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:33:14.900737 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:33:26.593325 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:33:26.593285 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:33:34.703749 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:33:34.703712 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:33:43.412713 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:33:43.412670 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:33:51.403689 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:33:51.403653 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:33:59.498668 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:33:59.498633 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:34:07.896703 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:34:07.896667 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:34:18.096225 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:34:18.096118 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:34:35.800950 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:34:35.800910 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:34:44.201026 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:34:44.200984 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:34:53.399976 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:34:53.399936 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:35:01.692243 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:35:01.692202 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:35:19.624774 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:35:19.624733 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:35:26.600906 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:35:26.600864 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:35:35.397835 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:35:35.397798 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:35:43.701329 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:35:43.701290 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:35:53.004037 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:35:53.004000 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:36:01.802197 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:36:01.802152 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:36:12.294778 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:36:12.294739 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:36:22.492017 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:36:22.491980 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:36:31.802935 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:36:31.802899 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:36:41.900043 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:36:41.899987 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:36:50.601244 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:36:50.601208 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:36:58.897272 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:36:58.897234 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:37:07.414131 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:37:07.414081 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:37:15.596857 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:37:15.596815 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:37:32.699960 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:37:32.699875 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:37:41.216974 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:37:41.216940 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:37:49.101182 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:37:49.101148 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:37:54.304698 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:37:54.304654 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:37:59.038157 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:37:59.038128 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log"
Apr 21 00:37:59.046190 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:37:59.046166 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log"
Apr 21 00:38:20.007779 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:20.007734 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:38:31.694830 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:31.694789 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m4tp9"]
Apr 21 00:38:40.984993 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:40.984952 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-rsvtb_55e50a99-fa92-4a13-8811-3ef46753ab44/manager/0.log"
Apr 21 00:38:41.221062 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:41.221026 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-779779f6b6-nkvzw_97ff5bbb-4b30-46d1-b6f4-3319bbdf0757/manager/0.log"
Apr 21 00:38:41.344189 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:41.344146 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-df9nr_aca2acb4-b224-4ea6-95d4-acfc142efd80/manager/0.log"
Apr 21 00:38:41.580998 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:41.580968 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-587f5698df-7dp74_34695680-8474-417f-927b-85cfda2c3b18/manager/0.log"
Apr 21 00:38:43.344199 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:43.344163 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-qb8p6_466b6843-581d-41df-9774-68a94c072b4f/manager/0.log"
Apr 21 00:38:43.695474 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:43.695381 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-knwsk_af5714cc-ec6c-4481-86ca-81efb87a2760/manager/0.log"
Apr 21 00:38:43.805968 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:43.805940 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-m4tp9_0f61a0ed-6b8e-488e-85be-e8868739e0e1/limitador/0.log"
Apr 21 00:38:44.382620 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:44.382574 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-gpv7z_175da6a9-33ab-4c24-b1aa-2909592e217a/discovery/0.log"
Apr 21 00:38:44.486596 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:44.486572 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-f6b6bc78b-8qr6r_56da5d14-f885-45da-9fb2-726093fe1373/kube-auth-proxy/0.log"
Apr 21 00:38:44.822627 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:44.822592 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-69675cd558-ml7nv_e4103e8a-c222-4304-953c-43f57e73acef/router/0.log"
Apr 21 00:38:45.271693 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:45.271652 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f_b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b/storage-initializer/0.log"
Apr 21 00:38:45.279247 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:45.279223 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-ldl5f_b96b337d-4ff4-4a8b-8bba-1c7fb5534d5b/main/0.log"
Apr 21 00:38:45.637593 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:45.637515 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5_408601c5-4214-43f1-b7ef-7271b1f9bd67/storage-initializer/0.log"
Apr 21 00:38:45.644855 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:45.644831 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-m7mm5_408601c5-4214-43f1-b7ef-7271b1f9bd67/main/0.log"
Apr 21 00:38:52.082053 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:52.082014 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8n24l_d4c3f54e-8135-4a92-b7dc-1bef279e0201/global-pull-secret-syncer/0.log"
Apr 21 00:38:52.279669 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:52.279639 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zsbmm_728512b6-8990-45f9-b0fa-89772f9c1362/konnectivity-agent/0.log"
Apr 21 00:38:52.353579 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:52.353498 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-115.ec2.internal_b1cf2d95e85fe20270acb11fb3142e37/haproxy/0.log"
Apr 21 00:38:56.056214 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:56.056184 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-qb8p6_466b6843-581d-41df-9774-68a94c072b4f/manager/0.log"
Apr 21 00:38:56.236023 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:56.235918 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-knwsk_af5714cc-ec6c-4481-86ca-81efb87a2760/manager/0.log"
Apr 21 00:38:56.260211 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:56.260185 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-m4tp9_0f61a0ed-6b8e-488e-85be-e8868739e0e1/limitador/0.log"
Apr 21 00:38:57.969208 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:57.969172 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-cn6dq_4854cd30-d0eb-4603-8e32-c7919b625f6c/cluster-monitoring-operator/0.log"
Apr 21 00:38:58.210117 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:58.210074 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hjvz8_141eeb26-5732-4466-8de8-641334400202/node-exporter/0.log"
Apr 21 00:38:58.230846 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:58.230769 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hjvz8_141eeb26-5732-4466-8de8-641334400202/kube-rbac-proxy/0.log"
Apr 21 00:38:58.250307 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:38:58.250283 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hjvz8_141eeb26-5732-4466-8de8-641334400202/init-textfile/0.log"
Apr 21 00:39:00.292107 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:00.292061 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-m2p9z_8ebef9e6-96c0-4de0-8ad0-70daf0b29f1e/networking-console-plugin/0.log"
Apr 21 00:39:00.865935 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:00.865898 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/1.log"
Apr 21 00:39:00.874184 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:00.874145 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-p9l6p_e91dab5b-ef33-493b-9cc9-99941410ef37/console-operator/2.log"
Apr 21 00:39:01.609699 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.609659 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"]
Apr 21 00:39:01.610225 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.610205 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d88444dc-37ae-4b12-9617-61db8b2334c2" containerName="cleanup"
Apr 21 00:39:01.610280 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.610229 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88444dc-37ae-4b12-9617-61db8b2334c2" containerName="cleanup"
Apr 21 00:39:01.610280 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.610249 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d88444dc-37ae-4b12-9617-61db8b2334c2" containerName="cleanup"
Apr 21 00:39:01.610280 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.610255 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88444dc-37ae-4b12-9617-61db8b2334c2" containerName="cleanup"
Apr 21 00:39:01.610374 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.610322 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d88444dc-37ae-4b12-9617-61db8b2334c2" containerName="cleanup"
Apr 21 00:39:01.610374 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.610332 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d88444dc-37ae-4b12-9617-61db8b2334c2" containerName="cleanup"
Apr 21 00:39:01.610374 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.610340 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d88444dc-37ae-4b12-9617-61db8b2334c2" containerName="cleanup"
Apr 21 00:39:01.613135 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.613088 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.615951 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.615928 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gc8bw\"/\"kube-root-ca.crt\""
Apr 21 00:39:01.617078 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.617058 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gc8bw\"/\"default-dockercfg-45ldp\""
Apr 21 00:39:01.617152 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.617058 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gc8bw\"/\"openshift-service-ca.crt\""
Apr 21 00:39:01.622161 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.622139 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"]
Apr 21 00:39:01.672282 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.672249 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45852c69-8a5c-4cb6-b6e2-34df8d69f898-sys\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.672466 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.672300 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/45852c69-8a5c-4cb6-b6e2-34df8d69f898-proc\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.672466 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.672341 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/45852c69-8a5c-4cb6-b6e2-34df8d69f898-podres\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.672466 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.672441 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl5zt\" (UniqueName: \"kubernetes.io/projected/45852c69-8a5c-4cb6-b6e2-34df8d69f898-kube-api-access-dl5zt\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.672594 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.672469 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45852c69-8a5c-4cb6-b6e2-34df8d69f898-lib-modules\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.773120 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.773063 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dl5zt\" (UniqueName: \"kubernetes.io/projected/45852c69-8a5c-4cb6-b6e2-34df8d69f898-kube-api-access-dl5zt\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.773322 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.773131 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45852c69-8a5c-4cb6-b6e2-34df8d69f898-lib-modules\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.773322 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.773164 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45852c69-8a5c-4cb6-b6e2-34df8d69f898-sys\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.773322 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.773206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/45852c69-8a5c-4cb6-b6e2-34df8d69f898-proc\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.773322 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.773231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/45852c69-8a5c-4cb6-b6e2-34df8d69f898-podres\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.773322 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.773271 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45852c69-8a5c-4cb6-b6e2-34df8d69f898-sys\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.773322 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.773314 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45852c69-8a5c-4cb6-b6e2-34df8d69f898-lib-modules\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.773584 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.773327 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/45852c69-8a5c-4cb6-b6e2-34df8d69f898-proc\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.773584 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.773376 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/45852c69-8a5c-4cb6-b6e2-34df8d69f898-podres\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.781203 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.781173 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl5zt\" (UniqueName: \"kubernetes.io/projected/45852c69-8a5c-4cb6-b6e2-34df8d69f898-kube-api-access-dl5zt\") pod \"perf-node-gather-daemonset-ddxh6\" (UID: \"45852c69-8a5c-4cb6-b6e2-34df8d69f898\") " pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:01.910617 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.910534 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-d4kjt_a2f4bd8a-7dae-45c0-9b1e-a5c145a09876/volume-data-source-validator/0.log"
Apr 21 00:39:01.923741 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:01.923715 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:02.048911 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:02.048877 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"]
Apr 21 00:39:02.051254 ip-10-0-143-115 kubenswrapper[2571]: W0421 00:39:02.051221 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod45852c69_8a5c_4cb6_b6e2_34df8d69f898.slice/crio-79f150d9c9aa9cf7cdb51357be83ddfa4fb62a6fa824b6b9d63b1bd5f83a40dd WatchSource:0}: Error finding container 79f150d9c9aa9cf7cdb51357be83ddfa4fb62a6fa824b6b9d63b1bd5f83a40dd: Status 404 returned error can't find the container with id 79f150d9c9aa9cf7cdb51357be83ddfa4fb62a6fa824b6b9d63b1bd5f83a40dd
Apr 21 00:39:02.052966 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:02.052944 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 00:39:02.667269 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:02.667225 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6" event={"ID":"45852c69-8a5c-4cb6-b6e2-34df8d69f898","Type":"ContainerStarted","Data":"02ab4550d51b5565b209cd57d2896678bc1dbcf646943b5f5f5571214f3ef216"}
Apr 21 00:39:02.667269 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:02.667274 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6" event={"ID":"45852c69-8a5c-4cb6-b6e2-34df8d69f898","Type":"ContainerStarted","Data":"79f150d9c9aa9cf7cdb51357be83ddfa4fb62a6fa824b6b9d63b1bd5f83a40dd"}
Apr 21 00:39:02.667759 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:02.667340 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:02.684243 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:02.684199 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6" podStartSLOduration=1.6841865839999999 podStartE2EDuration="1.684186584s" podCreationTimestamp="2026-04-21 00:39:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 00:39:02.681894674 +0000 UTC m=+2164.295266199" watchObservedRunningTime="2026-04-21 00:39:02.684186584 +0000 UTC m=+2164.297558106"
Apr 21 00:39:02.810489 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:02.810457 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nxkjd_5dca0c9f-ca96-4377-bb4d-280b9c469ca1/dns/0.log"
Apr 21 00:39:02.834017 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:02.833993 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nxkjd_5dca0c9f-ca96-4377-bb4d-280b9c469ca1/kube-rbac-proxy/0.log"
Apr 21 00:39:02.902113 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:02.902069 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gwszj_918f7c2d-8780-4291-b141-5fb77d94b6cf/dns-node-resolver/0.log"
Apr 21 00:39:03.409227 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:03.409196 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-pruner-29612160-ghs4n_8c729e77-783c-4ed2-831a-538689f33279/image-pruner/0.log"
Apr 21 00:39:03.489864 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:03.489816 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9hvww_dbf6c72e-bf6b-47f7-bac9-1b24d4a37975/node-ca/0.log"
Apr 21 00:39:04.526753 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:04.526718 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-gpv7z_175da6a9-33ab-4c24-b1aa-2909592e217a/discovery/0.log"
Apr 21 00:39:04.545454 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:04.545423 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-f6b6bc78b-8qr6r_56da5d14-f885-45da-9fb2-726093fe1373/kube-auth-proxy/0.log"
Apr 21 00:39:04.652770 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:04.652739 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-69675cd558-ml7nv_e4103e8a-c222-4304-953c-43f57e73acef/router/0.log"
Apr 21 00:39:05.148307 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:05.148276 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6w5rw_e28815ec-1f97-4757-b463-8aec1ad6b01e/serve-healthcheck-canary/0.log"
Apr 21 00:39:05.636773 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:05.636740 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-b4dln_73af315a-43db-4687-aa2e-2555ab2f3d65/insights-operator/0.log"
Apr 21 00:39:05.637284 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:05.637119 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-b4dln_73af315a-43db-4687-aa2e-2555ab2f3d65/insights-operator/1.log"
Apr 21 00:39:05.791882 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:05.791855 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sb2tk_44e13f0c-c185-4594-8599-652d3fba3595/kube-rbac-proxy/0.log"
Apr 21 00:39:05.810567 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:05.810543 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sb2tk_44e13f0c-c185-4594-8599-652d3fba3595/exporter/0.log"
Apr 21 00:39:05.830504 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:05.830475 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sb2tk_44e13f0c-c185-4594-8599-652d3fba3595/extractor/0.log"
Apr 21 00:39:07.798850 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:07.798811 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-rsvtb_55e50a99-fa92-4a13-8811-3ef46753ab44/manager/0.log"
Apr 21 00:39:07.909056 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:07.909024 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-779779f6b6-nkvzw_97ff5bbb-4b30-46d1-b6f4-3319bbdf0757/manager/0.log"
Apr 21 00:39:07.949001 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:07.948965 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-df9nr_aca2acb4-b224-4ea6-95d4-acfc142efd80/manager/0.log"
Apr 21 00:39:07.996520 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:07.996492 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-587f5698df-7dp74_34695680-8474-417f-927b-85cfda2c3b18/manager/0.log"
Apr 21 00:39:08.682961 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:08.682935 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gc8bw/perf-node-gather-daemonset-ddxh6"
Apr 21 00:39:14.207169 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:14.207136 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-m59z4_010e4dff-e2ae-4168-aa42-10e2537edc3c/kube-storage-version-migrator-operator/1.log"
Apr 21 00:39:14.209027 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:14.208995 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-m59z4_010e4dff-e2ae-4168-aa42-10e2537edc3c/kube-storage-version-migrator-operator/0.log"
Apr 21 00:39:15.344703 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:15.344664 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mghn5_f398c142-f284-48f9-b608-6eb7229425ae/kube-multus-additional-cni-plugins/0.log"
Apr 21 00:39:15.369732 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:15.369705 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mghn5_f398c142-f284-48f9-b608-6eb7229425ae/egress-router-binary-copy/0.log"
Apr 21 00:39:15.388677 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:15.388650 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mghn5_f398c142-f284-48f9-b608-6eb7229425ae/cni-plugins/0.log"
Apr 21 00:39:15.408485 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:15.408463 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mghn5_f398c142-f284-48f9-b608-6eb7229425ae/bond-cni-plugin/0.log"
Apr 21 00:39:15.429464 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:15.429435 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mghn5_f398c142-f284-48f9-b608-6eb7229425ae/routeoverride-cni/0.log"
Apr 21 00:39:15.449670 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:15.449644 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mghn5_f398c142-f284-48f9-b608-6eb7229425ae/whereabouts-cni-bincopy/0.log"
Apr 21 00:39:15.471359 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:15.471336 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mghn5_f398c142-f284-48f9-b608-6eb7229425ae/whereabouts-cni/0.log"
Apr 21 00:39:15.703525 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:15.703441 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hvbch_b578ee7e-4063-4906-b849-ca0d856e3c15/kube-multus/0.log"
Apr 21 00:39:15.752486 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:15.752451 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6fz9j_173d74c8-1f07-4764-a03f-8091e02dc212/network-metrics-daemon/0.log"
Apr 21 00:39:15.769663 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:15.769635 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6fz9j_173d74c8-1f07-4764-a03f-8091e02dc212/kube-rbac-proxy/0.log"
Apr 21 00:39:16.728498 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:16.728468 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2nrh8_012b4bee-5b6f-4bec-9704-f110e7aba3eb/ovn-controller/0.log"
Apr 21 00:39:16.764943 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:16.764914 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2nrh8_012b4bee-5b6f-4bec-9704-f110e7aba3eb/ovn-acl-logging/0.log"
Apr 21 00:39:16.783770 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:16.783747 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2nrh8_012b4bee-5b6f-4bec-9704-f110e7aba3eb/kube-rbac-proxy-node/0.log"
Apr 21 00:39:16.802684 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:16.802663 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2nrh8_012b4bee-5b6f-4bec-9704-f110e7aba3eb/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 00:39:16.818610 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:16.818583 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2nrh8_012b4bee-5b6f-4bec-9704-f110e7aba3eb/northd/0.log"
Apr 21 00:39:16.838160 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:16.838137 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2nrh8_012b4bee-5b6f-4bec-9704-f110e7aba3eb/nbdb/0.log"
Apr 21 00:39:16.857466 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:16.857444 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2nrh8_012b4bee-5b6f-4bec-9704-f110e7aba3eb/sbdb/0.log"
Apr 21 00:39:17.026908 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:17.026864 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2nrh8_012b4bee-5b6f-4bec-9704-f110e7aba3eb/ovnkube-controller/0.log"
Apr 21 00:39:18.709993 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:18.709965 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-jtsx5_fb84b042-37b8-43f5-94e4-43fb54d2041b/check-endpoints/0.log"
Apr 21 00:39:18.777948 ip-10-0-143-115 kubenswrapper[2571]: I0421 00:39:18.777921 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-s2mw9_a10f7678-f6da-46bb-86eb-c0de2afb421c/network-check-target-container/0.log"