Apr 17 16:16:45.549901 ip-10-0-134-77 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 16:16:45.549916 ip-10-0-134-77 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 16:16:45.549926 ip-10-0-134-77 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 16:16:45.550304 ip-10-0-134-77 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 16:16:56.989728 ip-10-0-134-77 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 16:16:56.989747 ip-10-0-134-77 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 5fd726b9b1f94de79323b00ccc06b62a --
Apr 17 16:19:29.519769 ip-10-0-134-77 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:19:29.934908 ip-10-0-134-77 kubenswrapper[2584]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:19:29.934908 ip-10-0-134-77 kubenswrapper[2584]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:19:29.934908 ip-10-0-134-77 kubenswrapper[2584]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:19:29.934908 ip-10-0-134-77 kubenswrapper[2584]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:19:29.935530 ip-10-0-134-77 kubenswrapper[2584]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:19:29.936762 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.936685 2584 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 17 16:19:29.941102 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941080 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:19:29.941102 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941098 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:19:29.941102 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941102 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:19:29.941102 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941106 2584 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:19:29.941102 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941109 2584 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941112 2584 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941116 2584 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941119 2584 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941122 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941125 2584 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941127 2584 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941131 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941133 2584 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941136 2584 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941139 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941141 2584 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941144 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941147 2584 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941149 2584 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941152 2584 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941154 2584 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 
17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941157 2584 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941159 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941162 2584 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:19:29.941288 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941164 2584 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941167 2584 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941177 2584 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941180 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941182 2584 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941185 2584 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941187 2584 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941190 2584 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941192 2584 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941195 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941197 2584 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941200 2584 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941203 2584 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941205 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941209 2584 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941212 2584 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941215 2584 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941218 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:19:29.941783 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941220 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:19:29.941783 ip-10-0-134-77 
kubenswrapper[2584]: W0417 16:19:29.941223 2584 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941225 2584 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941228 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941231 2584 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941233 2584 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941236 2584 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941238 2584 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941241 2584 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941243 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941246 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941248 2584 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941251 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941253 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941256 2584 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941258 2584 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941261 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941263 2584 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941266 2584 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941268 2584 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941271 2584 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941274 2584 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:19:29.942259 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941276 2584 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941279 2584 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:19:29.942792 ip-10-0-134-77 
kubenswrapper[2584]: W0417 16:19:29.941282 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941286 2584 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941289 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941292 2584 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941296 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941299 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941302 2584 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941304 2584 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941307 2584 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941310 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941312 2584 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941316 2584 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941322 2584 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941325 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941328 2584 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941331 2584 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941334 2584 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:19:29.942792 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941336 2584 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941339 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941341 2584 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941768 2584 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941774 2584 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941777 2584 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941780 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941782 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941786 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941789 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941791 2584 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941794 2584 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941797 2584 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941800 2584 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941802 2584 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941805 2584 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941808 2584 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941810 2584 feature_gate.go:328] 
unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941813 2584 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941816 2584 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:19:29.943243 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941819 2584 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941821 2584 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941824 2584 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941827 2584 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941829 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941832 2584 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941836 2584 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941841 2584 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941844 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941847 2584 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941850 2584 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941852 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941855 2584 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941857 2584 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941860 2584 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941862 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941865 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941867 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941870 2584 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:19:29.943842 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941872 2584 feature_gate.go:328] unrecognized feature gate: 
SetEIPForNLBIngressController Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941875 2584 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941877 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941880 2584 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941883 2584 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941886 2584 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941888 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941891 2584 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941910 2584 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941914 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941918 2584 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941921 2584 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941924 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941927 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941930 2584 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941933 2584 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941935 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941938 2584 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941941 2584 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941943 2584 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:19:29.944346 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941946 2584 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941949 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941952 2584 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 
16:19:29.941954 2584 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941957 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941959 2584 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941962 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941964 2584 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941966 2584 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941969 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941971 2584 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941974 2584 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941976 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941979 2584 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941981 2584 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941985 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941987 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941990 2584 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941993 2584 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:19:29.944850 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.941997 2584 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.942000 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.942003 2584 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.942006 2584 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.942008 2584 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.942011 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.942014 2584 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.942016 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.942019 2584 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.942021 2584 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.942024 2584 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943562 2584 flags.go:64] FLAG: --address="0.0.0.0" Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943571 2584 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943578 2584 flags.go:64] FLAG: --anonymous-auth="true" Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943583 2584 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943588 2584 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943591 2584 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943596 2584 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943600 2584 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943603 2584 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943606 2584 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 17 16:19:29.945332 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943610 2584 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943613 2584 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 
16:19:29.943616 2584 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943619 2584 flags.go:64] FLAG: --cgroup-root="" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943622 2584 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943624 2584 flags.go:64] FLAG: --client-ca-file="" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943627 2584 flags.go:64] FLAG: --cloud-config="" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943630 2584 flags.go:64] FLAG: --cloud-provider="external" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943633 2584 flags.go:64] FLAG: --cluster-dns="[]" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943636 2584 flags.go:64] FLAG: --cluster-domain="" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943639 2584 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943642 2584 flags.go:64] FLAG: --config-dir="" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943645 2584 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943648 2584 flags.go:64] FLAG: --container-log-max-files="5" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943652 2584 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943655 2584 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943659 2584 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943662 2584 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943665 2584 flags.go:64] FLAG: --contention-profiling="false" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943668 2584 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943671 2584 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943674 2584 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943677 2584 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943681 2584 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943684 2584 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 17 16:19:29.945870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943686 2584 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943690 2584 flags.go:64] FLAG: --enable-load-reader="false" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943693 2584 flags.go:64] FLAG: --enable-server="true" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943696 2584 flags.go:64] 
FLAG: --enforce-node-allocatable="[pods]" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943700 2584 flags.go:64] FLAG: --event-burst="100" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943703 2584 flags.go:64] FLAG: --event-qps="50" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943706 2584 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943709 2584 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943713 2584 flags.go:64] FLAG: --eviction-hard="" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943716 2584 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943719 2584 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943722 2584 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943725 2584 flags.go:64] FLAG: --eviction-soft="" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943728 2584 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943731 2584 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943733 2584 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943737 2584 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943739 2584 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943742 2584 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943745 2584 flags.go:64] FLAG: --feature-gates="" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943749 2584 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943752 2584 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943755 2584 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943758 2584 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943761 2584 flags.go:64] FLAG: --healthz-port="10248" Apr 17 16:19:29.946479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943764 2584 flags.go:64] FLAG: --help="false" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943767 2584 flags.go:64] FLAG: --hostname-override="ip-10-0-134-77.ec2.internal" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943770 2584 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943773 2584 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943776 2584 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943779 2584 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943783 2584 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943786 2584 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943788 2584 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943791 2584 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943794 2584 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943797 2584 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943800 2584 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943803 2584 flags.go:64] FLAG: --kube-reserved="" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943805 2584 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943808 2584 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943811 2584 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943814 2584 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943816 2584 flags.go:64] FLAG: --lock-file="" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943819 2584 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943822 2584 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943825 2584 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943831 2584 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:19:29.947089 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943834 2584 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943836 2584 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943839 2584 flags.go:64] FLAG: --logging-format="text" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943842 2584 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943845 2584 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943848 2584 flags.go:64] FLAG: --manifest-url="" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943851 2584 flags.go:64] FLAG: 
--manifest-url-header="" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943855 2584 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943859 2584 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943863 2584 flags.go:64] FLAG: --max-pods="110" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943866 2584 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943868 2584 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943871 2584 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943874 2584 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943877 2584 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943880 2584 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943882 2584 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943890 2584 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943893 2584 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943896 2584 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943899 2584 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943902 2584 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943907 2584 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943910 2584 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:19:29.947667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943912 2584 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943915 2584 flags.go:64] FLAG: --port="10250" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943918 2584 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943921 2584 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0abd0022965c72e12" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943924 2584 flags.go:64] FLAG: --qos-reserved="" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943931 2584 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943934 2584 flags.go:64] FLAG: --register-node="true" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943937 2584 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:19:29.948222 
ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943940 2584 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943943 2584 flags.go:64] FLAG: --registry-burst="10" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943946 2584 flags.go:64] FLAG: --registry-qps="5" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943949 2584 flags.go:64] FLAG: --reserved-cpus="" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943952 2584 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943956 2584 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943959 2584 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943962 2584 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943965 2584 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943968 2584 flags.go:64] FLAG: --runonce="false" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943970 2584 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943974 2584 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943977 2584 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943979 2584 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943982 2584 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943985 2584 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943988 2584 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943991 2584 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:19:29.948222 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943994 2584 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943997 2584 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.943999 2584 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944003 2584 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944006 2584 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944010 2584 flags.go:64] FLAG: --system-cgroups="" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944013 2584 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944018 2584 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:19:29.948855 ip-10-0-134-77 
kubenswrapper[2584]: I0417 16:19:29.944021 2584 flags.go:64] FLAG: --tls-cert-file="" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944024 2584 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944027 2584 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944031 2584 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944034 2584 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944037 2584 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944040 2584 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944043 2584 flags.go:64] FLAG: --v="2" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944047 2584 flags.go:64] FLAG: --version="false" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944051 2584 flags.go:64] FLAG: --vmodule="" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944056 2584 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944059 2584 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944163 2584 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944166 2584 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944169 2584 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944172 2584 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:19:29.948855 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944175 2584 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944178 2584 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944180 2584 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944183 2584 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944185 2584 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944188 2584 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944190 2584 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944193 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944195 2584 feature_gate.go:328] unrecognized feature gate: 
EtcdBackendQuota Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944198 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944200 2584 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944204 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944206 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944209 2584 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944212 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944215 2584 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944217 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944221 2584 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944225 2584 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944229 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:19:29.949446 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944232 2584 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944235 2584 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944238 2584 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944240 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944243 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944245 2584 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944248 2584 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944250 2584 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944255 2584 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944257 2584 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944260 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:19:29.950015 
ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944262 2584 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944265 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944267 2584 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944270 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944273 2584 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944275 2584 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944278 2584 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944280 2584 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944283 2584 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:19:29.950015 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944285 2584 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944288 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944290 2584 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944293 2584 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944295 2584 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944299 2584 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944301 2584 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944304 2584 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944306 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944309 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944311 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944315 2584 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944317 2584 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944320 2584 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944322 2584 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944325 2584 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944327 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944329 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944332 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944334 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:19:29.950526 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944338 2584 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944341 2584 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944343 2584 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944346 2584 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944349 2584 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944352 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944354 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944357 2584 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944360 2584 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944362 2584 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944365 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944367 2584 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944370 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944372 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944375 2584 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944377 2584 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:19:29.951008 
ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944380 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944382 2584 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944385 2584 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:19:29.951008 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944388 2584 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944392 2584 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.944395 2584 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.944405 2584 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.950779 2584 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.950798 2584 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950846 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950851 2584 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950854 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950857 2584 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950860 2584 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950863 2584 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950865 2584 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950868 2584 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950870 2584 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950873 2584 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:19:29.951482 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950875 2584 feature_gate.go:328] unrecognized 
feature gate: RouteAdvertisements Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950877 2584 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950880 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950883 2584 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950885 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950888 2584 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950890 2584 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950893 2584 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950895 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950897 2584 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950900 2584 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950902 2584 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950906 2584 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950910 2584 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950913 2584 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950916 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950920 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950922 2584 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950925 2584 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:19:29.951930 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950928 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950930 2584 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950934 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950937 2584 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950939 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950942 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950944 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950948 2584 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950950 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950952 2584 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950955 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950957 2584 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950960 2584 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950963 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950965 2584 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950968 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950970 2584 
feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950973 2584 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950975 2584 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950978 2584 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:19:29.952392 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950980 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950983 2584 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950985 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950988 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950990 2584 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950992 2584 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950995 2584 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.950997 2584 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951000 2584 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951003 2584 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951005 2584 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951008 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951010 2584 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951013 2584 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951017 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951019 2584 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951022 2584 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951025 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951028 2584 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:19:29.952926 
ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951030 2584 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:19:29.952926 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951033 2584 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951035 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951037 2584 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951040 2584 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951042 2584 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951045 2584 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951048 2584 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951050 2584 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951052 2584 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951055 2584 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951057 2584 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951060 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951062 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951064 2584 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951067 2584 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951069 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:19:29.953444 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951072 2584 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.951077 2584 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951174 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951178 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951181 2584 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951184 2584 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951187 2584 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951190 2584 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951193 2584 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951196 2584 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951199 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951202 2584 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951205 2584 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951207 2584 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951209 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951212 2584 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:19:29.953838 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951214 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951217 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951219 2584 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951222 2584 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951224 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 
16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951227 2584 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951230 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951232 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951235 2584 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951237 2584 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951240 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951242 2584 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951245 2584 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951247 2584 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951250 2584 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951252 2584 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951255 2584 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951257 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951259 2584 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:19:29.954248 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951262 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951264 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951267 2584 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951269 2584 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951272 2584 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951274 2584 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951276 2584 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951279 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:19:29.954713 ip-10-0-134-77 
kubenswrapper[2584]: W0417 16:19:29.951282 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951285 2584 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951287 2584 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951289 2584 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951292 2584 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951294 2584 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951297 2584 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951299 2584 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951302 2584 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951304 2584 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951307 2584 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951309 2584 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:19:29.954713 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951311 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951314 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951316 2584 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951318 2584 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951322 2584 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951325 2584 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951328 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951331 2584 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951334 2584 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951337 2584 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951340 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951342 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951345 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951347 2584 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951349 2584 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951352 2584 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951354 2584 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951357 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951360 2584 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:19:29.955406 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951362 2584 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951365 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951368 2584 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951370 2584 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951373 2584 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951375 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951378 2584 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951380 2584 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951382 2584 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951385 2584 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951387 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951389 2584 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951392 2584 
feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:29.951394 2584 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.951399 2584 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:19:29.956106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.952012 2584 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 16:19:29.956532 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.954753 2584 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 16:19:29.956532 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.955642 2584 server.go:1019] "Starting client certificate rotation" Apr 17 16:19:29.956532 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.955753 2584 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 16:19:29.956532 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.955797 2584 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 16:19:29.976918 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.976898 2584 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 16:19:29.979198 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.979171 2584 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 16:19:29.991751 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.991733 2584 log.go:25] "Validated CRI v1 runtime API" Apr 17 16:19:29.996862 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.996845 2584 log.go:25] "Validated CRI v1 image API" Apr 17 16:19:29.998147 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:29.998133 2584 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 16:19:30.003558 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.003537 2584 fs.go:135] Filesystem UUIDs: map[20e12bc0-ffa1-4a52-9af1-db3ce8406eef:/dev/nvme0n1p4 3a5f89f4-a99e-4cd3-b25d-8d992b3a5392:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 17 16:19:30.003642 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.003559 2584 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 16:19:30.005769 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.005748 2584 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" 
type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:19:30.009625 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.009521 2584 manager.go:217] Machine: {Timestamp:2026-04-17 16:19:30.007656327 +0000 UTC m=+0.381569260 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3125948 MemoryCapacity:32812179456 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b22828d6b22de2346e6ab34387634 SystemUUID:ec2b2282-8d6b-22de-2346-e6ab34387634 BootID:5fd726b9-b1f9-4de7-9323-b00ccc06b62a Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406089728 Type:vfs Inodes:4005393 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562439168 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3d:82:68:61:2d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:3d:82:68:61:2d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0a:b5:ad:e7:88:6f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812179456 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 16:19:30.009625 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.009619 2584 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 17 16:19:30.009756 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.009690 2584 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 16:19:30.010671 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.010647 2584 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 16:19:30.010804 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.010672 2584 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-77.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 16:19:30.010851 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.010815 2584 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 16:19:30.010851 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.010823 2584 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 16:19:30.010851 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.010835 2584 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:19:30.011573 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.011563 2584 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:19:30.012704 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.012694 2584 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:19:30.012803 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.012794 2584 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 16:19:30.015456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.015446 2584 kubelet.go:491] "Attempting to sync node with API server" Apr 17 16:19:30.015988 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.015976 2584 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 17 16:19:30.016027 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.016002 2584 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 16:19:30.016027 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.016014 2584 kubelet.go:397] "Adding apiserver pod source" Apr 17 16:19:30.016027 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.016027 2584 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 16:19:30.016995 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.016983 2584 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:19:30.017044 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.017003 2584 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:19:30.020168 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.020150 2584 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 16:19:30.021842 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.021829 2584 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 16:19:30.023033 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.023017 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 16:19:30.023085 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.023048 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 16:19:30.023085 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.023061 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 16:19:30.023085 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.023071 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 16:19:30.023085 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.023083 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 16:19:30.023255 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.023094 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 16:19:30.023255 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.023106 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 16:19:30.023255 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.023117 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 16:19:30.023255 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.023131 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 16:19:30.023255 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.023143 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 16:19:30.023255 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.023177 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 16:19:30.023255 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.023194 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 16:19:30.024021 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.024002 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 16:19:30.024106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.024023 2584 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 17 16:19:30.028065 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.028014 2584 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 16:19:30.028407 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.028396 2584 server.go:1295] "Started kubelet" Apr 17 16:19:30.029161 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.029110 2584 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 16:19:30.029489 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.029169 2584 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 16:19:30.029591 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.029567 2584 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 16:19:30.029680 ip-10-0-134-77 systemd[1]: Started Kubernetes Kubelet. Apr 17 16:19:30.030230 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.030179 2584 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-77.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 16:19:30.030799 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.030648 2584 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-77.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 16:19:30.031093 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.031071 2584 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 16:19:30.031176 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.031083 2584 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 16:19:30.033389 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.033373 2584 server.go:317] "Adding debug handlers to kubelet server" Apr 17 16:19:30.036130 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.035098 2584 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-77.ec2.internal.18a731498f56fa28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-77.ec2.internal,UID:ip-10-0-134-77.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-77.ec2.internal,},FirstTimestamp:2026-04-17 16:19:30.028165672 +0000 UTC m=+0.402078614,LastTimestamp:2026-04-17 16:19:30.028165672 +0000 UTC m=+0.402078614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-77.ec2.internal,}" Apr 17 16:19:30.036739 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.036714 2584 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 16:19:30.037279 ip-10-0-134-77 
kubenswrapper[2584]: I0417 16:19:30.037263 2584 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 16:19:30.037875 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.037859 2584 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 16:19:30.037875 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.037860 2584 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 16:19:30.037875 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.037880 2584 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 16:19:30.038210 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.038040 2584 reconstruct.go:97] "Volume reconstruction finished" Apr 17 16:19:30.038210 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.038050 2584 reconciler.go:26] "Reconciler: start to sync state" Apr 17 16:19:30.038210 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.038087 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 17 16:19:30.040326 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.040310 2584 factory.go:55] Registering systemd factory Apr 17 16:19:30.040430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.040331 2584 factory.go:223] Registration of the systemd container factory successfully Apr 17 16:19:30.041078 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.041059 2584 factory.go:153] Registering CRI-O factory Apr 17 16:19:30.041078 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.041081 2584 factory.go:223] Registration of the crio container factory successfully Apr 17 16:19:30.041205 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.041133 2584 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 16:19:30.041205 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.041159 2584 factory.go:103] Registering Raw factory Apr 17 16:19:30.041205 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.041173 2584 manager.go:1196] Started watching for new ooms in manager Apr 17 16:19:30.041862 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.041614 2584 manager.go:319] Starting recovery of all containers Apr 17 16:19:30.044561 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.044346 2584 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 16:19:30.051171 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.049876 2584 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-77.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 16:19:30.051171 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.050020 2584 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 16:19:30.051171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.050630 2584 manager.go:324] Recovery completed Apr 17 16:19:30.056026 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.056011 2584 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:19:30.059688 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.059665 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:19:30.059760 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.059702 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:19:30.059760 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.059712 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:19:30.060208 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.060179 2584 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 16:19:30.060208 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.060199 2584 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 16:19:30.060273 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.060220 2584 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:19:30.062173 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.062162 2584 policy_none.go:49] "None policy: Start" Apr 17 16:19:30.062218 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.062178 2584 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 16:19:30.062218 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.062187 2584 state_mem.go:35] "Initializing new in-memory state store" Apr 17 16:19:30.069525 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.069415 2584 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-77.ec2.internal.18a731499137fa82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-77.ec2.internal,UID:ip-10-0-134-77.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-77.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-77.ec2.internal,},FirstTimestamp:2026-04-17 16:19:30.059688578 +0000 UTC m=+0.433601511,LastTimestamp:2026-04-17 
16:19:30.059688578 +0000 UTC m=+0.433601511,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-77.ec2.internal,}" Apr 17 16:19:30.071292 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.071227 2584 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-htqbh" Apr 17 16:19:30.075881 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.075816 2584 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-77.ec2.internal.18a7314991384203 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-77.ec2.internal,UID:ip-10-0-134-77.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-134-77.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-134-77.ec2.internal,},FirstTimestamp:2026-04-17 16:19:30.059706883 +0000 UTC m=+0.433619816,LastTimestamp:2026-04-17 16:19:30.059706883 +0000 UTC m=+0.433619816,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-77.ec2.internal,}" Apr 17 16:19:30.078146 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.078124 2584 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-htqbh" Apr 17 16:19:30.103247 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.098027 2584 manager.go:341] "Starting Device Plugin manager" Apr 17 16:19:30.103247 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.098053 2584 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 16:19:30.103247 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.098062 2584 server.go:85] "Starting device plugin registration server" Apr 17 16:19:30.103247 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.098240 2584 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 16:19:30.103247 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.098249 2584 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 16:19:30.103247 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.098371 2584 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 16:19:30.103247 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.098431 2584 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 16:19:30.103247 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.098437 2584 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 16:19:30.103247 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.098919 2584 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 16:19:30.103247 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.098952 2584 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-77.ec2.internal\" not found" Apr 17 16:19:30.183201 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.183160 2584 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 16:19:30.184450 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.184429 2584 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 16:19:30.184585 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.184458 2584 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 16:19:30.184585 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.184477 2584 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 16:19:30.184585 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.184485 2584 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 16:19:30.184585 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.184538 2584 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 16:19:30.187751 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.187708 2584 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:19:30.198600 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.198588 2584 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:19:30.199602 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.199587 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:19:30.199674 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.199620 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:19:30.199674 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.199635 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:19:30.199674 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.199667 2584 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.209549 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.209535 2584 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.209635 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.209558 2584 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-77.ec2.internal\": node \"ip-10-0-134-77.ec2.internal\" not found" Apr 17 16:19:30.221536 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.221515 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 17 16:19:30.284855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.284832 2584 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal"] Apr 17 16:19:30.284924 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.284905 2584 kubelet_node_status.go:413] 
"Setting node annotation to enable volume controller attach/detach" Apr 17 16:19:30.286210 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.286197 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:19:30.286264 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.286224 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:19:30.286264 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.286239 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:19:30.287363 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.287350 2584 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:19:30.287507 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.287481 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.287546 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.287528 2584 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:19:30.288056 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.288033 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:19:30.288139 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.288065 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:19:30.288139 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.288076 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:19:30.288139 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.288038 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:19:30.288252 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.288141 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:19:30.288252 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.288155 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:19:30.289728 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.289716 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.289771 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.289743 2584 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:19:30.290350 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.290335 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:19:30.290412 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.290357 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:19:30.290412 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.290370 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:19:30.321097 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.321074 2584 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-77.ec2.internal\" not found" node="ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.321709 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.321689 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 17 16:19:30.325321 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.325306 2584 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-77.ec2.internal\" not found" node="ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.339952 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.339937 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/454bf7e88903cb3fed5cc9e7d8cf5d0d-config\") pod \"kube-apiserver-proxy-ip-10-0-134-77.ec2.internal\" (UID: \"454bf7e88903cb3fed5cc9e7d8cf5d0d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.340023 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.339961 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/816568b9527d9455f848c001abfac64a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal\" (UID: \"816568b9527d9455f848c001abfac64a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.340023 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.339978 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/816568b9527d9455f848c001abfac64a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal\" (UID: \"816568b9527d9455f848c001abfac64a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.422643 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.422609 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 17 16:19:30.441117 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.441065 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/454bf7e88903cb3fed5cc9e7d8cf5d0d-config\") pod \"kube-apiserver-proxy-ip-10-0-134-77.ec2.internal\" (UID: \"454bf7e88903cb3fed5cc9e7d8cf5d0d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.441117 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.441093 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/816568b9527d9455f848c001abfac64a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal\" (UID: \"816568b9527d9455f848c001abfac64a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.441117 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.441112 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/816568b9527d9455f848c001abfac64a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal\" (UID: \"816568b9527d9455f848c001abfac64a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.441279 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.441157 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/816568b9527d9455f848c001abfac64a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal\" (UID: \"816568b9527d9455f848c001abfac64a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.441279 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.441166 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/454bf7e88903cb3fed5cc9e7d8cf5d0d-config\") pod \"kube-apiserver-proxy-ip-10-0-134-77.ec2.internal\" (UID: \"454bf7e88903cb3fed5cc9e7d8cf5d0d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.441279 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.441191 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/816568b9527d9455f848c001abfac64a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal\" (UID: \"816568b9527d9455f848c001abfac64a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.523520 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.523467 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 17 16:19:30.623088 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.623059 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.624070 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.624048 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 17 16:19:30.627485 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.627465 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.724440 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.724353 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 17 16:19:30.824881 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.824854 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 17 16:19:30.925323 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.925299 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 17 16:19:30.928978 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.928960 2584 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:19:30.937983 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.937965 2584 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.948032 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.948010 2584 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:19:30.949547 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.949535 2584 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 17 16:19:30.955477 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.955464 2584 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 16:19:30.955591 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.955575 2584 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 16:19:30.955646 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:30.955630 2584 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 16:19:30.955696 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:30.955644 2584 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a193e3cc5d13f423981993119da8d00b-a9a85d8ed83075d5.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.134.77:35424->44.214.63.97:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 17 16:19:31.016814 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.016760 2584 apiserver.go:52] "Watching apiserver" Apr 17 16:19:31.032490 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.032470 2584 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 16:19:31.032826 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.032805 2584 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/konnectivity-agent-v59gz","kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal","openshift-image-registry/node-ca-cdxj4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal","openshift-ovn-kubernetes/ovnkube-node-8qvmb","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn","openshift-cluster-node-tuning-operator/tuned-bbp92","openshift-dns/node-resolver-976rk","openshift-multus/multus-additional-cni-plugins-p6896","openshift-multus/multus-s848g","openshift-multus/network-metrics-daemon-cgrms","openshift-network-diagnostics/network-check-target-m64jw","openshift-network-operator/iptables-alerter-fcm59"] Apr 17 16:19:31.035176 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.035156 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v59gz" Apr 17 16:19:31.036205 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.036188 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cdxj4" Apr 17 16:19:31.037321 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.037302 2584 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 16:19:31.037408 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.037379 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.037907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.037816 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 16:19:31.037907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.037822 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 16:19:31.038036 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.038020 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wdh86\"" Apr 17 16:19:31.038424 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.038391 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.038861 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.038400 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 16:19:31.038861 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.038677 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 16:19:31.039016 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.038882 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 16:19:31.039016 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.039004 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jx9nj\"" Apr 17 16:19:31.040235 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.040214 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 16:19:31.040235 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.040217 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 16:19:31.040375 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.040258 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 16:19:31.040375 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.040222 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 16:19:31.040375 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.040307 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 16:19:31.040834 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.040819 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 16:19:31.040940 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.040857 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tzvxd\"" Apr 17 16:19:31.040940 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.040912 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 16:19:31.041092 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.040944 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.041092 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.041072 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 16:19:31.041092 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.041080 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-976rk" Apr 17 16:19:31.041403 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.041388 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 16:19:31.041477 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.041431 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h7pmc\"" Apr 17 16:19:31.042193 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.042178 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.043307 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.043291 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 16:19:31.043381 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.043351 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:19:31.043577 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.043561 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s848g" Apr 17 16:19:31.043650 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.043589 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-q95ft\"" Apr 17 16:19:31.044345 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.043976 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 16:19:31.044345 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.044048 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-vl8tq\"" Apr 17 16:19:31.044491 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.044373 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-var-lib-openvswitch\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.044491 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.044423 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-etc-openvswitch\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.044635 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.044518 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 16:19:31.044635 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.044516 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e12c6987-366e-4f26-ae6c-75cc6a5d3967-ovnkube-config\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 
16:19:31.044635 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.044592 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e12c6987-366e-4f26-ae6c-75cc6a5d3967-env-overrides\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.044635 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.044630 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-socket-dir\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.044896 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.044688 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8l7s\" (UniqueName: \"kubernetes.io/projected/113e554f-2f68-4d9a-9462-cb366ab6d005-kube-api-access-j8l7s\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.044896 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.044790 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-run-openvswitch\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.044896 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.044866 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e12c6987-366e-4f26-ae6c-75cc6a5d3967-ovn-node-metrics-cert\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.045052 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.044898 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-device-dir\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.045052 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.044935 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-sysconfig\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.045167 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045072 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:31.045244 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045085 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-run\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.045244 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:31.045227 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045247 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-sys\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045284 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-run-ovn-kubernetes\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045301 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045306 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm9qb\" (UniqueName: \"kubernetes.io/projected/e12c6987-366e-4f26-ae6c-75cc6a5d3967-kube-api-access-rm9qb\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045325 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-etc-selinux\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045343 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-modprobe-d\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045361 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-kubernetes\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045377 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-sysctl-conf\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045432 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-cni-bin\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045459 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-cni-netd\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045486 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-run-systemd\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045530 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045586 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4dd52096-bca5-4442-852e-5f41d1bb9827-host\") pod \"node-ca-cdxj4\" (UID: \"4dd52096-bca5-4442-852e-5f41d1bb9827\") " pod="openshift-image-registry/node-ca-cdxj4" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045606 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4dd52096-bca5-4442-852e-5f41d1bb9827-serviceca\") pod \"node-ca-cdxj4\" (UID: \"4dd52096-bca5-4442-852e-5f41d1bb9827\") " pod="openshift-image-registry/node-ca-cdxj4" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045640 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44nc2\" (UniqueName: \"kubernetes.io/projected/4dd52096-bca5-4442-852e-5f41d1bb9827-kube-api-access-44nc2\") pod \"node-ca-cdxj4\" (UID: \"4dd52096-bca5-4442-852e-5f41d1bb9827\") " pod="openshift-image-registry/node-ca-cdxj4" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045666 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-log-socket\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045729 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-systemd\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.047301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045769 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-run-netns\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045810 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045841 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-var-lib-kubelet\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.045906 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-host\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046018 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046043 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046077 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-tuned\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046108 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ebb991a9-7c10-423a-8330-afafc79edd8c-agent-certs\") pod \"konnectivity-agent-v59gz\" (UID: \"ebb991a9-7c10-423a-8330-afafc79edd8c\") " 
pod="kube-system/konnectivity-agent-v59gz" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046139 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ebb991a9-7c10-423a-8330-afafc79edd8c-konnectivity-ca\") pod \"konnectivity-agent-v59gz\" (UID: \"ebb991a9-7c10-423a-8330-afafc79edd8c\") " pod="kube-system/konnectivity-agent-v59gz" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046208 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-kubelet\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046260 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046278 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-node-log\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046328 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e12c6987-366e-4f26-ae6c-75cc6a5d3967-ovnkube-script-lib\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046369 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-np6qb\"" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046376 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-sys-fs\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046424 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-tmp\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046469 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2tlq\" (UniqueName: \"kubernetes.io/projected/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-kube-api-access-p2tlq\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046525 2584 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-run-ovn\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046570 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-kubelet-dir\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.048254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046595 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-registration-dir\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.048958 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046642 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-sysctl-d\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.048958 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046678 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-lib-modules\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.048958 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046742 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-slash\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.048958 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.046773 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-systemd-units\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.048958 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.047662 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:31.048958 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:31.048004 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:31.048958 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.048194 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vh9w7\"" Apr 17 16:19:31.048958 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.048242 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 16:19:31.048958 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.048636 2584 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:19:31.049417 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.049165 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fcm59" Apr 17 16:19:31.051474 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.051458 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dwqz4\"" Apr 17 16:19:31.051630 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.051485 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 16:19:31.051738 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.051723 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:19:31.051790 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.051741 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 16:19:31.071765 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.071744 2584 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gmbdd" Apr 17 16:19:31.080314 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.080277 2584 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:14:30 +0000 UTC" deadline="2027-10-09 18:38:26.76652479 +0000 UTC" Apr 17 16:19:31.080314 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.080308 2584 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12962h18m55.686219203s" Apr 17 16:19:31.084319 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.084301 2584 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gmbdd" Apr 17 16:19:31.089422 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.089297 2584 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:19:31.138930 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.138908 2584 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 16:19:31.147139 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147121 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-var-lib-openvswitch\") pod \"ovnkube-node-8qvmb\" (UID: 
\"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.147255 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147146 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e12c6987-366e-4f26-ae6c-75cc6a5d3967-ovnkube-config\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.147255 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147169 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e12c6987-366e-4f26-ae6c-75cc6a5d3967-env-overrides\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.147255 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147184 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8l7s\" (UniqueName: \"kubernetes.io/projected/113e554f-2f68-4d9a-9462-cb366ab6d005-kube-api-access-j8l7s\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.147255 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147235 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-var-lib-openvswitch\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.147430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147279 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7ae29549-3750-4378-9d33-2e6bfdb368b5-multus-daemon-config\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.147430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147304 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2w2z\" (UniqueName: \"kubernetes.io/projected/7ae29549-3750-4378-9d33-2e6bfdb368b5-kube-api-access-g2w2z\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.147430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147321 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-run-openvswitch\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.147430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147339 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e12c6987-366e-4f26-ae6c-75cc6a5d3967-ovn-node-metrics-cert\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.147430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147361 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-sys\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.147430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147404 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-run-openvswitch\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.147708 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147435 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-sys\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.147708 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147442 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-multus-socket-dir-parent\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.147708 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147470 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-tmp\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.147708 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147489 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2tlq\" (UniqueName: \"kubernetes.io/projected/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-kube-api-access-p2tlq\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.147708 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147533 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-modprobe-d\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.147708 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147560 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c20048a9-0ed2-477d-9b33-25dc727aeda5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.147708 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147586 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c20048a9-0ed2-477d-9b33-25dc727aeda5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 
16:19:31.147708 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147610 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-hostroot\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.147708 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147633 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-var-lib-kubelet\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.147708 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147658 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-cni-bin\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.147708 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147682 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-cni-netd\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.147708 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147706 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44nc2\" (UniqueName: \"kubernetes.io/projected/4dd52096-bca5-4442-852e-5f41d1bb9827-kube-api-access-44nc2\") pod \"node-ca-cdxj4\" (UID: \"4dd52096-bca5-4442-852e-5f41d1bb9827\") " pod="openshift-image-registry/node-ca-cdxj4" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147725 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-log-socket\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147745 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-systemd\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147768 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-cnibin\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147771 2584 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147787 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-modprobe-d\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147789 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-multus-conf-dir\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147826 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-run-multus-certs\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147854 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ebb991a9-7c10-423a-8330-afafc79edd8c-agent-certs\") pod \"konnectivity-agent-v59gz\" (UID: \"ebb991a9-7c10-423a-8330-afafc79edd8c\") " pod="kube-system/konnectivity-agent-v59gz" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147905 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-cni-bin\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147903 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e12c6987-366e-4f26-ae6c-75cc6a5d3967-ovnkube-config\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147959 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-cni-netd\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147853 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-log-socket\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148052 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-systemd\") 
pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148102 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-var-lib-kubelet\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.148263 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148223 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-kubelet\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.147880 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-kubelet\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148333 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-node-log\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148350 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-sys-fs\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148366 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c20048a9-0ed2-477d-9b33-25dc727aeda5-system-cni-dir\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148389 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svwlf\" (UniqueName: \"kubernetes.io/projected/c20048a9-0ed2-477d-9b33-25dc727aeda5-kube-api-access-svwlf\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148416 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-kubelet-dir\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148310 2584 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e12c6987-366e-4f26-ae6c-75cc6a5d3967-env-overrides\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148440 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-sysctl-d\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148526 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-kubelet-dir\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148442 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-sys-fs\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148551 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-lib-modules\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148547 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-sysctl-d\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148589 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcnpj\" (UniqueName: \"kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj\") pod \"network-check-target-m64jw\" (UID: \"a750395e-b985-4d27-bd7c-fc1fdea00304\") " pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148436 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-node-log\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148620 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-multus-cni-dir\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 
17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148635 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-lib-modules\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.149171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148646 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-var-lib-cni-bin\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.150874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148701 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-systemd-units\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.150874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148725 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-systemd-units\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.150874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148728 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:31.150874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148759 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-etc-openvswitch\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.150874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148785 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-socket-dir\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.150874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148812 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w5gc\" (UniqueName: \"kubernetes.io/projected/b2b1e0e2-9eb2-4e4f-9027-c81e854a984c-kube-api-access-8w5gc\") pod \"node-resolver-976rk\" (UID: \"b2b1e0e2-9eb2-4e4f-9027-c81e854a984c\") " pod="openshift-dns/node-resolver-976rk" Apr 17 16:19:31.150874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148828 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-etc-openvswitch\") pod 
\"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.150874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148839 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-os-release\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.150874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148867 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-device-dir\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.150874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.148896 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-socket-dir\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.150874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.149034 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-sysconfig\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.150874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.149246 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-run\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.150874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.149284 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg7gw\" (UniqueName: \"kubernetes.io/projected/dcbdb9dc-df9a-4c0b-850e-370061051a08-kube-api-access-lg7gw\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:31.151456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151069 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e12c6987-366e-4f26-ae6c-75cc6a5d3967-ovn-node-metrics-cert\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.151456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151137 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-sysconfig\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.151456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151173 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" 
(UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-device-dir\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.151456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151213 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c20048a9-0ed2-477d-9b33-25dc727aeda5-cnibin\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.151456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151237 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c20048a9-0ed2-477d-9b33-25dc727aeda5-cni-binary-copy\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.151456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151259 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e898edb3-e2ac-4eca-a223-fa4687085a0e-iptables-alerter-script\") pod \"iptables-alerter-fcm59\" (UID: \"e898edb3-e2ac-4eca-a223-fa4687085a0e\") " pod="openshift-network-operator/iptables-alerter-fcm59" Apr 17 16:19:31.151456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151289 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-run-ovn-kubernetes\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.151456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151310 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm9qb\" (UniqueName: \"kubernetes.io/projected/e12c6987-366e-4f26-ae6c-75cc6a5d3967-kube-api-access-rm9qb\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.151456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151335 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-etc-selinux\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.151456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151358 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-kubernetes\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.151456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151385 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-sysctl-conf\") pod \"tuned-bbp92\" (UID: 
\"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.151456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151401 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-tuned\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.151456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151421 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b2b1e0e2-9eb2-4e4f-9027-c81e854a984c-hosts-file\") pod \"node-resolver-976rk\" (UID: \"b2b1e0e2-9eb2-4e4f-9027-c81e854a984c\") " pod="openshift-dns/node-resolver-976rk" Apr 17 16:19:31.151456 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151448 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-run-systemd\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151470 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4dd52096-bca5-4442-852e-5f41d1bb9827-host\") pod \"node-ca-cdxj4\" (UID: \"4dd52096-bca5-4442-852e-5f41d1bb9827\") " pod="openshift-image-registry/node-ca-cdxj4" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151491 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4dd52096-bca5-4442-852e-5f41d1bb9827-serviceca\") pod \"node-ca-cdxj4\" (UID: \"4dd52096-bca5-4442-852e-5f41d1bb9827\") " pod="openshift-image-registry/node-ca-cdxj4" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151537 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-run-netns\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151555 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-var-lib-cni-multus\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151575 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-run-netns\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151602 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151622 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-host\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151643 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b2b1e0e2-9eb2-4e4f-9027-c81e854a984c-tmp-dir\") pod \"node-resolver-976rk\" (UID: \"b2b1e0e2-9eb2-4e4f-9027-c81e854a984c\") " pod="openshift-dns/node-resolver-976rk" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151659 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c20048a9-0ed2-477d-9b33-25dc727aeda5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151685 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-system-cni-dir\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151705 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-run-k8s-cni-cncf-io\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151728 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkxf6\" (UniqueName: \"kubernetes.io/projected/e898edb3-e2ac-4eca-a223-fa4687085a0e-kube-api-access-qkxf6\") pod \"iptables-alerter-fcm59\" (UID: \"e898edb3-e2ac-4eca-a223-fa4687085a0e\") " pod="openshift-network-operator/iptables-alerter-fcm59" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151759 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ebb991a9-7c10-423a-8330-afafc79edd8c-konnectivity-ca\") pod \"konnectivity-agent-v59gz\" (UID: \"ebb991a9-7c10-423a-8330-afafc79edd8c\") " pod="kube-system/konnectivity-agent-v59gz" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151783 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e12c6987-366e-4f26-ae6c-75cc6a5d3967-ovnkube-script-lib\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 
16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151820 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-run-ovn\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151854 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-registration-dir\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.152106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151887 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ae29549-3750-4378-9d33-2e6bfdb368b5-cni-binary-copy\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151917 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-etc-kubernetes\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151940 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e898edb3-e2ac-4eca-a223-fa4687085a0e-host-slash\") pod \"iptables-alerter-fcm59\" (UID: \"e898edb3-e2ac-4eca-a223-fa4687085a0e\") " pod="openshift-network-operator/iptables-alerter-fcm59" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151966 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-slash\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151984 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c20048a9-0ed2-477d-9b33-25dc727aeda5-os-release\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.151999 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-var-lib-kubelet\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152088 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-run-ovn-kubernetes\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152116 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ebb991a9-7c10-423a-8330-afafc79edd8c-agent-certs\") pod \"konnectivity-agent-v59gz\" (UID: \"ebb991a9-7c10-423a-8330-afafc79edd8c\") " pod="kube-system/konnectivity-agent-v59gz" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152221 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-etc-selinux\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152274 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-kubernetes\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152354 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-sysctl-conf\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152407 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-run-systemd\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152450 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-host\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152464 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4dd52096-bca5-4442-852e-5f41d1bb9827-host\") pod \"node-ca-cdxj4\" (UID: \"4dd52096-bca5-4442-852e-5f41d1bb9827\") " pod="openshift-image-registry/node-ca-cdxj4" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152414 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-run-netns\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152558 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152602 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-host-slash\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152641 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-run\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.153030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152935 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e12c6987-366e-4f26-ae6c-75cc6a5d3967-run-ovn\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.153872 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152940 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ebb991a9-7c10-423a-8330-afafc79edd8c-konnectivity-ca\") pod \"konnectivity-agent-v59gz\" (UID: \"ebb991a9-7c10-423a-8330-afafc79edd8c\") " pod="kube-system/konnectivity-agent-v59gz" Apr 17 16:19:31.153872 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.152977 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4dd52096-bca5-4442-852e-5f41d1bb9827-serviceca\") pod \"node-ca-cdxj4\" (UID: \"4dd52096-bca5-4442-852e-5f41d1bb9827\") " pod="openshift-image-registry/node-ca-cdxj4" Apr 17 16:19:31.153872 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.153460 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e12c6987-366e-4f26-ae6c-75cc6a5d3967-ovnkube-script-lib\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.153872 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.153549 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/113e554f-2f68-4d9a-9462-cb366ab6d005-registration-dir\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.153872 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.153781 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-tmp\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.156079 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.156052 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-etc-tuned\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.156396 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.156370 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44nc2\" (UniqueName: \"kubernetes.io/projected/4dd52096-bca5-4442-852e-5f41d1bb9827-kube-api-access-44nc2\") pod \"node-ca-cdxj4\" (UID: \"4dd52096-bca5-4442-852e-5f41d1bb9827\") " pod="openshift-image-registry/node-ca-cdxj4" Apr 17 16:19:31.156481 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.156406 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2tlq\" (UniqueName: \"kubernetes.io/projected/2a639ac9-e8e4-4fc8-b970-ad9917fcbff7-kube-api-access-p2tlq\") pod \"tuned-bbp92\" (UID: \"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7\") " pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.156621 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.156604 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8l7s\" (UniqueName: \"kubernetes.io/projected/113e554f-2f68-4d9a-9462-cb366ab6d005-kube-api-access-j8l7s\") pod \"aws-ebs-csi-driver-node-d8vsn\" (UID: \"113e554f-2f68-4d9a-9462-cb366ab6d005\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.159364 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.159338 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm9qb\" (UniqueName: \"kubernetes.io/projected/e12c6987-366e-4f26-ae6c-75cc6a5d3967-kube-api-access-rm9qb\") pod \"ovnkube-node-8qvmb\" (UID: \"e12c6987-366e-4f26-ae6c-75cc6a5d3967\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.161877 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:31.161854 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod454bf7e88903cb3fed5cc9e7d8cf5d0d.slice/crio-b65325f94797eea6f60e66f35affa5d035f759a92539dda7cd72c25630c4a944 WatchSource:0}: Error finding container b65325f94797eea6f60e66f35affa5d035f759a92539dda7cd72c25630c4a944: Status 404 returned error can't find the container with id b65325f94797eea6f60e66f35affa5d035f759a92539dda7cd72c25630c4a944 Apr 17 16:19:31.162091 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:31.162071 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod816568b9527d9455f848c001abfac64a.slice/crio-0e44f4dd5249b7ce4b067f2a80355162db4773d85d10bc033bf5c29bc447f371 WatchSource:0}: Error finding container 0e44f4dd5249b7ce4b067f2a80355162db4773d85d10bc033bf5c29bc447f371: Status 404 returned error can't find the container with id 0e44f4dd5249b7ce4b067f2a80355162db4773d85d10bc033bf5c29bc447f371 Apr 17 16:19:31.165755 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.165741 2584 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:19:31.187063 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.187030 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" event={"ID":"816568b9527d9455f848c001abfac64a","Type":"ContainerStarted","Data":"0e44f4dd5249b7ce4b067f2a80355162db4773d85d10bc033bf5c29bc447f371"} Apr 17 16:19:31.187992 ip-10-0-134-77 
kubenswrapper[2584]: I0417 16:19:31.187975 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" event={"ID":"454bf7e88903cb3fed5cc9e7d8cf5d0d","Type":"ContainerStarted","Data":"b65325f94797eea6f60e66f35affa5d035f759a92539dda7cd72c25630c4a944"} Apr 17 16:19:31.236823 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.236796 2584 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:19:31.253015 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.252992 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ae29549-3750-4378-9d33-2e6bfdb368b5-cni-binary-copy\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253099 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253023 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-etc-kubernetes\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253099 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253047 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e898edb3-e2ac-4eca-a223-fa4687085a0e-host-slash\") pod \"iptables-alerter-fcm59\" (UID: \"e898edb3-e2ac-4eca-a223-fa4687085a0e\") " pod="openshift-network-operator/iptables-alerter-fcm59" Apr 17 16:19:31.253205 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253097 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e898edb3-e2ac-4eca-a223-fa4687085a0e-host-slash\") pod \"iptables-alerter-fcm59\" (UID: \"e898edb3-e2ac-4eca-a223-fa4687085a0e\") " pod="openshift-network-operator/iptables-alerter-fcm59" Apr 17 16:19:31.253205 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253105 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-etc-kubernetes\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253205 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253132 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c20048a9-0ed2-477d-9b33-25dc727aeda5-os-release\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.253205 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253159 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-var-lib-kubelet\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253205 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253184 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c20048a9-0ed2-477d-9b33-25dc727aeda5-os-release\") pod 
\"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.253205 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253193 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7ae29549-3750-4378-9d33-2e6bfdb368b5-multus-daemon-config\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253228 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-var-lib-kubelet\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253234 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2w2z\" (UniqueName: \"kubernetes.io/projected/7ae29549-3750-4378-9d33-2e6bfdb368b5-kube-api-access-g2w2z\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253265 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-multus-socket-dir-parent\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253295 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c20048a9-0ed2-477d-9b33-25dc727aeda5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253322 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c20048a9-0ed2-477d-9b33-25dc727aeda5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253343 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-multus-socket-dir-parent\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253346 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-hostroot\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253382 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-hostroot\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253399 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-cnibin\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253425 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-multus-conf-dir\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253449 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-run-multus-certs\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253469 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-cnibin\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253478 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c20048a9-0ed2-477d-9b33-25dc727aeda5-system-cni-dir\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253492 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ae29549-3750-4378-9d33-2e6bfdb368b5-cni-binary-copy\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253518 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svwlf\" (UniqueName: \"kubernetes.io/projected/c20048a9-0ed2-477d-9b33-25dc727aeda5-kube-api-access-svwlf\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253535 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-multus-conf-dir\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253556 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-run-multus-certs\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.253605 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253580 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c20048a9-0ed2-477d-9b33-25dc727aeda5-system-cni-dir\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253562 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcnpj\" (UniqueName: \"kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj\") pod \"network-check-target-m64jw\" (UID: \"a750395e-b985-4d27-bd7c-fc1fdea00304\") " pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253618 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-multus-cni-dir\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253644 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-var-lib-cni-bin\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253671 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253685 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7ae29549-3750-4378-9d33-2e6bfdb368b5-multus-daemon-config\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253699 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-multus-cni-dir\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253708 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-var-lib-cni-bin\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253699 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8w5gc\" (UniqueName: \"kubernetes.io/projected/b2b1e0e2-9eb2-4e4f-9027-c81e854a984c-kube-api-access-8w5gc\") pod \"node-resolver-976rk\" (UID: \"b2b1e0e2-9eb2-4e4f-9027-c81e854a984c\") " pod="openshift-dns/node-resolver-976rk" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253754 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-os-release\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:31.253787 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253814 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lg7gw\" (UniqueName: \"kubernetes.io/projected/dcbdb9dc-df9a-4c0b-850e-370061051a08-kube-api-access-lg7gw\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:31.253868 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs podName:dcbdb9dc-df9a-4c0b-850e-370061051a08 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:31.753826141 +0000 UTC m=+2.127739062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs") pod "network-metrics-daemon-cgrms" (UID: "dcbdb9dc-df9a-4c0b-850e-370061051a08") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253869 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-os-release\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253890 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c20048a9-0ed2-477d-9b33-25dc727aeda5-cnibin\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253898 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c20048a9-0ed2-477d-9b33-25dc727aeda5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253924 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c20048a9-0ed2-477d-9b33-25dc727aeda5-cni-binary-copy\") pod \"multus-additional-cni-plugins-p6896\" (UID: 
\"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.254430 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253950 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c20048a9-0ed2-477d-9b33-25dc727aeda5-cnibin\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253953 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c20048a9-0ed2-477d-9b33-25dc727aeda5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.253968 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e898edb3-e2ac-4eca-a223-fa4687085a0e-iptables-alerter-script\") pod \"iptables-alerter-fcm59\" (UID: \"e898edb3-e2ac-4eca-a223-fa4687085a0e\") " pod="openshift-network-operator/iptables-alerter-fcm59" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254009 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b2b1e0e2-9eb2-4e4f-9027-c81e854a984c-hosts-file\") pod \"node-resolver-976rk\" (UID: \"b2b1e0e2-9eb2-4e4f-9027-c81e854a984c\") " pod="openshift-dns/node-resolver-976rk" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254029 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-run-netns\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254045 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-var-lib-cni-multus\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254060 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-run-netns\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254065 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b2b1e0e2-9eb2-4e4f-9027-c81e854a984c-tmp-dir\") pod \"node-resolver-976rk\" (UID: \"b2b1e0e2-9eb2-4e4f-9027-c81e854a984c\") " pod="openshift-dns/node-resolver-976rk" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254087 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-var-lib-cni-multus\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254097 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c20048a9-0ed2-477d-9b33-25dc727aeda5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254062 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b2b1e0e2-9eb2-4e4f-9027-c81e854a984c-hosts-file\") pod \"node-resolver-976rk\" (UID: \"b2b1e0e2-9eb2-4e4f-9027-c81e854a984c\") " pod="openshift-dns/node-resolver-976rk" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254126 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-system-cni-dir\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254175 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-run-k8s-cni-cncf-io\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254201 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkxf6\" (UniqueName: \"kubernetes.io/projected/e898edb3-e2ac-4eca-a223-fa4687085a0e-kube-api-access-qkxf6\") pod \"iptables-alerter-fcm59\" (UID: \"e898edb3-e2ac-4eca-a223-fa4687085a0e\") " pod="openshift-network-operator/iptables-alerter-fcm59" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254227 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c20048a9-0ed2-477d-9b33-25dc727aeda5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254249 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-system-cni-dir\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254253 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7ae29549-3750-4378-9d33-2e6bfdb368b5-host-run-k8s-cni-cncf-io\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254299 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b2b1e0e2-9eb2-4e4f-9027-c81e854a984c-tmp-dir\") pod \"node-resolver-976rk\" (UID: \"b2b1e0e2-9eb2-4e4f-9027-c81e854a984c\") " pod="openshift-dns/node-resolver-976rk" Apr 17 16:19:31.254907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254417 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e898edb3-e2ac-4eca-a223-fa4687085a0e-iptables-alerter-script\") pod \"iptables-alerter-fcm59\" (UID: \"e898edb3-e2ac-4eca-a223-fa4687085a0e\") " pod="openshift-network-operator/iptables-alerter-fcm59" Apr 17 16:19:31.255406 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.254434 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c20048a9-0ed2-477d-9b33-25dc727aeda5-cni-binary-copy\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.262398 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:31.262383 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:19:31.262452 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:31.262403 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:19:31.262452 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:31.262413 2584 projected.go:194] Error preparing data for projected volume kube-api-access-tcnpj for pod openshift-network-diagnostics/network-check-target-m64jw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:31.262550 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:31.262454 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj podName:a750395e-b985-4d27-bd7c-fc1fdea00304 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:31.762441922 +0000 UTC m=+2.136354842 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tcnpj" (UniqueName: "kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj") pod "network-check-target-m64jw" (UID: "a750395e-b985-4d27-bd7c-fc1fdea00304") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:31.265209 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.265182 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svwlf\" (UniqueName: \"kubernetes.io/projected/c20048a9-0ed2-477d-9b33-25dc727aeda5-kube-api-access-svwlf\") pod \"multus-additional-cni-plugins-p6896\" (UID: \"c20048a9-0ed2-477d-9b33-25dc727aeda5\") " pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.265643 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.265627 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2w2z\" (UniqueName: \"kubernetes.io/projected/7ae29549-3750-4378-9d33-2e6bfdb368b5-kube-api-access-g2w2z\") pod \"multus-s848g\" (UID: \"7ae29549-3750-4378-9d33-2e6bfdb368b5\") " pod="openshift-multus/multus-s848g" Apr 17 16:19:31.265815 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.265796 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkxf6\" (UniqueName: \"kubernetes.io/projected/e898edb3-e2ac-4eca-a223-fa4687085a0e-kube-api-access-qkxf6\") pod \"iptables-alerter-fcm59\" (UID: \"e898edb3-e2ac-4eca-a223-fa4687085a0e\") " pod="openshift-network-operator/iptables-alerter-fcm59" Apr 17 16:19:31.265976 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.265959 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg7gw\" (UniqueName: \"kubernetes.io/projected/dcbdb9dc-df9a-4c0b-850e-370061051a08-kube-api-access-lg7gw\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:31.266137 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.266121 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w5gc\" (UniqueName: \"kubernetes.io/projected/b2b1e0e2-9eb2-4e4f-9027-c81e854a984c-kube-api-access-8w5gc\") pod \"node-resolver-976rk\" (UID: \"b2b1e0e2-9eb2-4e4f-9027-c81e854a984c\") " pod="openshift-dns/node-resolver-976rk" Apr 17 16:19:31.371133 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.371110 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v59gz" Apr 17 16:19:31.376922 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:31.376898 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebb991a9_7c10_423a_8330_afafc79edd8c.slice/crio-d12ca4844af16a4199715dd9f4db6baace986e660115d41d672546d552855f12 WatchSource:0}: Error finding container d12ca4844af16a4199715dd9f4db6baace986e660115d41d672546d552855f12: Status 404 returned error can't find the container with id d12ca4844af16a4199715dd9f4db6baace986e660115d41d672546d552855f12 Apr 17 16:19:31.378116 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.378100 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cdxj4" Apr 17 16:19:31.384207 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:31.384188 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd52096_bca5_4442_852e_5f41d1bb9827.slice/crio-9b52c8cb32e621b4ac49b4bcb9e87c7e93ddf11b46b6d664596f264b5b7a0614 WatchSource:0}: Error finding container 9b52c8cb32e621b4ac49b4bcb9e87c7e93ddf11b46b6d664596f264b5b7a0614: Status 404 returned error can't find the container with id 9b52c8cb32e621b4ac49b4bcb9e87c7e93ddf11b46b6d664596f264b5b7a0614 Apr 17 16:19:31.408915 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.408898 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:31.414105 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:31.414078 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode12c6987_366e_4f26_ae6c_75cc6a5d3967.slice/crio-85bbb0075c1eb2091862ce6a20231f65c8abc94ef3ccaf1ba86d585ef47e1a2e WatchSource:0}: Error finding container 85bbb0075c1eb2091862ce6a20231f65c8abc94ef3ccaf1ba86d585ef47e1a2e: Status 404 returned error can't find the container with id 85bbb0075c1eb2091862ce6a20231f65c8abc94ef3ccaf1ba86d585ef47e1a2e Apr 17 16:19:31.424967 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.424941 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" Apr 17 16:19:31.430491 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.430473 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bbp92" Apr 17 16:19:31.432109 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:31.432077 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113e554f_2f68_4d9a_9462_cb366ab6d005.slice/crio-97008a41e799f587d4f2e160a39371c22debf7aba40ec7cbba02ca54e226d3fd WatchSource:0}: Error finding container 97008a41e799f587d4f2e160a39371c22debf7aba40ec7cbba02ca54e226d3fd: Status 404 returned error can't find the container with id 97008a41e799f587d4f2e160a39371c22debf7aba40ec7cbba02ca54e226d3fd Apr 17 16:19:31.471078 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.471052 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-976rk" Apr 17 16:19:31.476673 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:31.476650 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2b1e0e2_9eb2_4e4f_9027_c81e854a984c.slice/crio-af832b704c65c4e848bbc4471dd0e2277e6d395f3cf497830941a87d11aeae04 WatchSource:0}: Error finding container af832b704c65c4e848bbc4471dd0e2277e6d395f3cf497830941a87d11aeae04: Status 404 returned error can't find the container with id af832b704c65c4e848bbc4471dd0e2277e6d395f3cf497830941a87d11aeae04 Apr 17 16:19:31.476747 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.476712 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p6896" Apr 17 16:19:31.482554 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:31.482530 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc20048a9_0ed2_477d_9b33_25dc727aeda5.slice/crio-f0f70b22ce41c4891ce9fe17be1944c6bca492f75a8684726e60470cad9c9453 WatchSource:0}: Error finding container f0f70b22ce41c4891ce9fe17be1944c6bca492f75a8684726e60470cad9c9453: Status 404 returned error can't find the container with id f0f70b22ce41c4891ce9fe17be1944c6bca492f75a8684726e60470cad9c9453 Apr 17 16:19:31.482631 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.482586 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s848g" Apr 17 16:19:31.488567 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.488551 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fcm59" Apr 17 16:19:31.488790 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:31.488774 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ae29549_3750_4378_9d33_2e6bfdb368b5.slice/crio-97011367ca8471e37320a984112534a67cc4070654aec4e38a39233fe7fa0d8f WatchSource:0}: Error finding container 97011367ca8471e37320a984112534a67cc4070654aec4e38a39233fe7fa0d8f: Status 404 returned error can't find the container with id 97011367ca8471e37320a984112534a67cc4070654aec4e38a39233fe7fa0d8f Apr 17 16:19:31.494350 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:19:31.494333 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode898edb3_e2ac_4eca_a223_fa4687085a0e.slice/crio-8f6c18fc596954e3c731cda3010bcc2b4f402bfc569b6d7bd52421252306b7bd WatchSource:0}: Error finding container 8f6c18fc596954e3c731cda3010bcc2b4f402bfc569b6d7bd52421252306b7bd: Status 404 returned error can't find the container with id 8f6c18fc596954e3c731cda3010bcc2b4f402bfc569b6d7bd52421252306b7bd Apr 17 16:19:31.758878 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.758793 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:31.759025 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:31.758979 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:31.759091 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:31.759047 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs podName:dcbdb9dc-df9a-4c0b-850e-370061051a08 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:32.75902781 +0000 UTC m=+3.132940742 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs") pod "network-metrics-daemon-cgrms" (UID: "dcbdb9dc-df9a-4c0b-850e-370061051a08") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:31.859630 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.859587 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcnpj\" (UniqueName: \"kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj\") pod \"network-check-target-m64jw\" (UID: \"a750395e-b985-4d27-bd7c-fc1fdea00304\") " pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:31.859811 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:31.859795 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:19:31.859859 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:31.859818 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:19:31.859859 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:31.859831 2584 projected.go:194] Error preparing data for projected volume kube-api-access-tcnpj for pod openshift-network-diagnostics/network-check-target-m64jw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:31.859930 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:31.859888 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj podName:a750395e-b985-4d27-bd7c-fc1fdea00304 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:32.859868798 +0000 UTC m=+3.233781735 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-tcnpj" (UniqueName: "kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj") pod "network-check-target-m64jw" (UID: "a750395e-b985-4d27-bd7c-fc1fdea00304") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:31.892212 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:31.892187 2584 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:19:32.085487 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:32.085319 2584 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:14:31 +0000 UTC" deadline="2028-01-09 09:18:21.220807844 +0000 UTC" Apr 17 16:19:32.085487 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:32.085351 2584 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15160h58m49.135461037s" Apr 17 16:19:32.185600 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:32.185573 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:32.185755 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:32.185712 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:32.194797 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:32.194761 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6896" event={"ID":"c20048a9-0ed2-477d-9b33-25dc727aeda5","Type":"ContainerStarted","Data":"f0f70b22ce41c4891ce9fe17be1944c6bca492f75a8684726e60470cad9c9453"} Apr 17 16:19:32.202248 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:32.202212 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" event={"ID":"113e554f-2f68-4d9a-9462-cb366ab6d005","Type":"ContainerStarted","Data":"97008a41e799f587d4f2e160a39371c22debf7aba40ec7cbba02ca54e226d3fd"} Apr 17 16:19:32.215652 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:32.215619 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" event={"ID":"e12c6987-366e-4f26-ae6c-75cc6a5d3967","Type":"ContainerStarted","Data":"85bbb0075c1eb2091862ce6a20231f65c8abc94ef3ccaf1ba86d585ef47e1a2e"} Apr 17 16:19:32.242538 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:32.240069 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cdxj4" event={"ID":"4dd52096-bca5-4442-852e-5f41d1bb9827","Type":"ContainerStarted","Data":"9b52c8cb32e621b4ac49b4bcb9e87c7e93ddf11b46b6d664596f264b5b7a0614"} Apr 17 16:19:32.242538 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:32.241862 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v59gz" event={"ID":"ebb991a9-7c10-423a-8330-afafc79edd8c","Type":"ContainerStarted","Data":"d12ca4844af16a4199715dd9f4db6baace986e660115d41d672546d552855f12"} Apr 17 16:19:32.251670 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:32.251635 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s848g" event={"ID":"7ae29549-3750-4378-9d33-2e6bfdb368b5","Type":"ContainerStarted","Data":"97011367ca8471e37320a984112534a67cc4070654aec4e38a39233fe7fa0d8f"} Apr 17 16:19:32.256573 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:32.256527 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-976rk" event={"ID":"b2b1e0e2-9eb2-4e4f-9027-c81e854a984c","Type":"ContainerStarted","Data":"af832b704c65c4e848bbc4471dd0e2277e6d395f3cf497830941a87d11aeae04"} Apr 17 16:19:32.265328 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:32.265294 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bbp92" event={"ID":"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7","Type":"ContainerStarted","Data":"6c258020308cf98d66fc7054451d2ff717e45e32a4b9b6431c5358d3a67fe12b"} Apr 17 16:19:32.266514 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:32.266470 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fcm59" 
event={"ID":"e898edb3-e2ac-4eca-a223-fa4687085a0e","Type":"ContainerStarted","Data":"8f6c18fc596954e3c731cda3010bcc2b4f402bfc569b6d7bd52421252306b7bd"} Apr 17 16:19:32.767887 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:32.767852 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:32.768079 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:32.768006 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:32.768135 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:32.768079 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs podName:dcbdb9dc-df9a-4c0b-850e-370061051a08 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:34.768060897 +0000 UTC m=+5.141973818 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs") pod "network-metrics-daemon-cgrms" (UID: "dcbdb9dc-df9a-4c0b-850e-370061051a08") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:32.869418 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:32.868750 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcnpj\" (UniqueName: \"kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj\") pod \"network-check-target-m64jw\" (UID: \"a750395e-b985-4d27-bd7c-fc1fdea00304\") " pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:32.869418 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:32.868937 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:19:32.869418 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:32.868971 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:19:32.869418 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:32.868985 2584 projected.go:194] Error preparing data for projected volume kube-api-access-tcnpj for pod openshift-network-diagnostics/network-check-target-m64jw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:32.869418 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:32.869046 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj podName:a750395e-b985-4d27-bd7c-fc1fdea00304 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:34.869027791 +0000 UTC m=+5.242940725 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tcnpj" (UniqueName: "kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj") pod "network-check-target-m64jw" (UID: "a750395e-b985-4d27-bd7c-fc1fdea00304") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:33.086711 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:33.086663 2584 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:14:31 +0000 UTC" deadline="2027-10-13 01:36:24.568870084 +0000 UTC" Apr 17 16:19:33.086711 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:33.086708 2584 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13041h16m51.482165937s" Apr 17 16:19:33.185449 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:33.184962 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:33.185449 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:33.185087 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:33.942860 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:33.942622 2584 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:19:34.194129 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:34.193525 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:34.194129 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:34.193664 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:34.785674 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:34.785637 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:34.785850 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:34.785807 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:34.785904 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:34.785870 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs podName:dcbdb9dc-df9a-4c0b-850e-370061051a08 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:38.785852783 +0000 UTC m=+9.159765722 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs") pod "network-metrics-daemon-cgrms" (UID: "dcbdb9dc-df9a-4c0b-850e-370061051a08") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:34.886998 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:34.886363 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcnpj\" (UniqueName: \"kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj\") pod \"network-check-target-m64jw\" (UID: \"a750395e-b985-4d27-bd7c-fc1fdea00304\") " pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:34.886998 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:34.886568 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:19:34.886998 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:34.886586 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:19:34.886998 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:34.886599 2584 projected.go:194] Error preparing data for projected volume kube-api-access-tcnpj for pod openshift-network-diagnostics/network-check-target-m64jw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:34.886998 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:34.886657 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj podName:a750395e-b985-4d27-bd7c-fc1fdea00304 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:38.886639033 +0000 UTC m=+9.260551971 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-tcnpj" (UniqueName: "kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj") pod "network-check-target-m64jw" (UID: "a750395e-b985-4d27-bd7c-fc1fdea00304") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:35.185789 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:35.185706 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:35.185945 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:35.185830 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:36.188739 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:36.188707 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:36.189254 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:36.188845 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:37.184881 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:37.184848 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:37.185058 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:37.184985 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:38.188320 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:38.188277 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:38.188786 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:38.188427 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:38.815377 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:38.815339 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:38.815581 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:38.815487 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:38.815581 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:38.815580 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs podName:dcbdb9dc-df9a-4c0b-850e-370061051a08 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:46.815560653 +0000 UTC m=+17.189473591 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs") pod "network-metrics-daemon-cgrms" (UID: "dcbdb9dc-df9a-4c0b-850e-370061051a08") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:38.916379 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:38.916263 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcnpj\" (UniqueName: \"kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj\") pod \"network-check-target-m64jw\" (UID: \"a750395e-b985-4d27-bd7c-fc1fdea00304\") " pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:38.916646 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:38.916446 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:19:38.916646 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:38.916470 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:19:38.916646 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:38.916483 2584 projected.go:194] Error preparing data for projected volume kube-api-access-tcnpj for pod openshift-network-diagnostics/network-check-target-m64jw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:38.916646 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:38.916596 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj podName:a750395e-b985-4d27-bd7c-fc1fdea00304 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:46.916577385 +0000 UTC m=+17.290490322 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-tcnpj" (UniqueName: "kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj") pod "network-check-target-m64jw" (UID: "a750395e-b985-4d27-bd7c-fc1fdea00304") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:39.185299 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:39.185215 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:39.185459 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:39.185352 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:40.185392 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:40.185359 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:40.185751 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:40.185448 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:41.184713 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:41.184681 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:41.184901 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:41.184803 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:42.185358 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:42.185317 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:42.185843 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:42.185446 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:43.184994 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:43.184953 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:43.185162 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:43.185084 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:43.400905 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:43.400873 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qhfkp"] Apr 17 16:19:43.425397 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:43.425368 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:43.425562 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:43.425439 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qhfkp" podUID="bace55ce-7fc7-4b76-82f8-0f8250ee98a7" Apr 17 16:19:43.550972 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:43.550923 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-dbus\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:43.551137 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:43.551005 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:43.551137 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:43.551062 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-kubelet-config\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:43.651560 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:43.651518 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-kubelet-config\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:43.651732 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:43.651588 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-dbus\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:43.651732 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:43.651632 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:43.651732 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:43.651633 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-kubelet-config\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:43.651871 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:43.651795 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-dbus\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:43.651871 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:43.651828 2584 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:19:43.651940 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:43.651900 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret podName:bace55ce-7fc7-4b76-82f8-0f8250ee98a7 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:44.151880861 +0000 UTC m=+14.525793785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret") pod "global-pull-secret-syncer-qhfkp" (UID: "bace55ce-7fc7-4b76-82f8-0f8250ee98a7") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:19:44.154516 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:44.154461 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:44.154691 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:44.154630 2584 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:19:44.154763 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:44.154710 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret podName:bace55ce-7fc7-4b76-82f8-0f8250ee98a7 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:45.15469015 +0000 UTC m=+15.528603072 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret") pod "global-pull-secret-syncer-qhfkp" (UID: "bace55ce-7fc7-4b76-82f8-0f8250ee98a7") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:19:44.185646 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:44.185616 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:44.185815 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:44.185731 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:45.160484 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:45.160448 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:45.160874 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:45.160572 2584 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:19:45.160874 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:45.160646 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret podName:bace55ce-7fc7-4b76-82f8-0f8250ee98a7 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:47.160627489 +0000 UTC m=+17.534540413 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret") pod "global-pull-secret-syncer-qhfkp" (UID: "bace55ce-7fc7-4b76-82f8-0f8250ee98a7") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:19:45.185449 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:45.185418 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:45.185596 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:45.185423 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:45.185596 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:45.185542 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:45.185686 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:45.185644 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhfkp" podUID="bace55ce-7fc7-4b76-82f8-0f8250ee98a7" Apr 17 16:19:46.184763 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:46.184728 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:46.185194 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:46.184873 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:46.875161 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:46.875130 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:46.875376 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:46.875300 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:46.875376 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:46.875373 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs podName:dcbdb9dc-df9a-4c0b-850e-370061051a08 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:02.875352404 +0000 UTC m=+33.249265340 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs") pod "network-metrics-daemon-cgrms" (UID: "dcbdb9dc-df9a-4c0b-850e-370061051a08") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:46.976477 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:46.976436 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcnpj\" (UniqueName: \"kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj\") pod \"network-check-target-m64jw\" (UID: \"a750395e-b985-4d27-bd7c-fc1fdea00304\") " pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:46.976675 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:46.976633 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:19:46.976675 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:46.976662 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:19:46.976766 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:46.976677 2584 projected.go:194] Error preparing data for projected volume kube-api-access-tcnpj for pod openshift-network-diagnostics/network-check-target-m64jw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:46.976766 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:46.976739 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj podName:a750395e-b985-4d27-bd7c-fc1fdea00304 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:02.976721119 +0000 UTC m=+33.350634042 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tcnpj" (UniqueName: "kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj") pod "network-check-target-m64jw" (UID: "a750395e-b985-4d27-bd7c-fc1fdea00304") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:47.177713 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:47.177637 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:47.177871 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:47.177759 2584 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:19:47.177871 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:47.177824 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret podName:bace55ce-7fc7-4b76-82f8-0f8250ee98a7 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:51.177810372 +0000 UTC m=+21.551723297 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret") pod "global-pull-secret-syncer-qhfkp" (UID: "bace55ce-7fc7-4b76-82f8-0f8250ee98a7") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:19:47.184846 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:47.184821 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:47.185205 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:47.184864 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:47.185205 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:47.184960 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:47.185205 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:47.185062 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhfkp" podUID="bace55ce-7fc7-4b76-82f8-0f8250ee98a7" Apr 17 16:19:48.185351 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:48.185318 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:48.185948 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:48.185464 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:49.185167 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:49.185134 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:49.185167 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:49.185154 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:49.185362 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:49.185231 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhfkp" podUID="bace55ce-7fc7-4b76-82f8-0f8250ee98a7" Apr 17 16:19:49.185401 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:49.185376 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:50.186092 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:50.186057 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:50.186614 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:50.186165 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:50.314785 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:50.314553 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" event={"ID":"454bf7e88903cb3fed5cc9e7d8cf5d0d","Type":"ContainerStarted","Data":"5de3b57b2d822d92b147e78557620761e568efc49555bee88e3acc42fb21f68e"} Apr 17 16:19:50.326980 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:50.326928 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" podStartSLOduration=20.3268917 podStartE2EDuration="20.3268917s" podCreationTimestamp="2026-04-17 16:19:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:19:50.326787108 +0000 UTC m=+20.700700052" watchObservedRunningTime="2026-04-17 16:19:50.3268917 +0000 UTC m=+20.700804643" Apr 17 16:19:50.327907 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:50.327877 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" event={"ID":"e12c6987-366e-4f26-ae6c-75cc6a5d3967","Type":"ContainerStarted","Data":"61b2d2972f95b42c88301a29148488998e6c6d19bd49cb6e5e60c873226ad270"} Apr 17 16:19:50.327992 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:50.327918 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" event={"ID":"e12c6987-366e-4f26-ae6c-75cc6a5d3967","Type":"ContainerStarted","Data":"a9e82c4512026431d127fa623d29f672a2875c5cfd7dfa225fb0c812d20f9dd2"} Apr 17 16:19:50.327992 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:50.327934 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" event={"ID":"e12c6987-366e-4f26-ae6c-75cc6a5d3967","Type":"ContainerStarted","Data":"64aa0c166b656d05d5418238bfc57c2a14bb92be5b02b5f7e78e21e079c0d17a"} Apr 17 16:19:50.327992 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:50.327946 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" event={"ID":"e12c6987-366e-4f26-ae6c-75cc6a5d3967","Type":"ContainerStarted","Data":"c2402dc63face3ed023974440b6b7510033c4fd18fa44c0237d08ecf7c51f6ca"} Apr 17 16:19:50.327992 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:50.327958 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" event={"ID":"e12c6987-366e-4f26-ae6c-75cc6a5d3967","Type":"ContainerStarted","Data":"386ae5364ba6c5392cad2b951643e4bfa4d3e6eee7e9cf91f9dd6aea2cf81925"} Apr 17 16:19:50.331174 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:50.330634 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s848g" event={"ID":"7ae29549-3750-4378-9d33-2e6bfdb368b5","Type":"ContainerStarted","Data":"31cc1f89ee00d02c9f7b2942a59cec09838761c5ff09f0abe041f0e16fe5b184"} Apr 17 16:19:50.332299 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:50.332273 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bbp92" event={"ID":"2a639ac9-e8e4-4fc8-b970-ad9917fcbff7","Type":"ContainerStarted","Data":"419189da68eb24e83d5fb04140857b357fd9225c4832057fbd5a13a98298d3d4"} Apr 17 16:19:50.343648 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:50.343597 2584 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/multus-s848g" podStartSLOduration=2.15171596 podStartE2EDuration="20.343582116s" podCreationTimestamp="2026-04-17 16:19:30 +0000 UTC" firstStartedPulling="2026-04-17 16:19:31.491186704 +0000 UTC m=+1.865099625" lastFinishedPulling="2026-04-17 16:19:49.683052857 +0000 UTC m=+20.056965781" observedRunningTime="2026-04-17 16:19:50.342995708 +0000 UTC m=+20.716908652" watchObservedRunningTime="2026-04-17 16:19:50.343582116 +0000 UTC m=+20.717495058" Apr 17 16:19:50.360815 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:50.360690 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bbp92" podStartSLOduration=2.4718258300000002 podStartE2EDuration="20.360673262s" podCreationTimestamp="2026-04-17 16:19:30 +0000 UTC" firstStartedPulling="2026-04-17 16:19:31.439526798 +0000 UTC m=+1.813439720" lastFinishedPulling="2026-04-17 16:19:49.328374227 +0000 UTC m=+19.702287152" observedRunningTime="2026-04-17 16:19:50.360672789 +0000 UTC m=+20.734585732" watchObservedRunningTime="2026-04-17 16:19:50.360673262 +0000 UTC m=+20.734586206" Apr 17 16:19:51.185070 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.184856 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:51.185237 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.184861 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:51.185237 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:51.185172 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:51.185333 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:51.185242 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhfkp" podUID="bace55ce-7fc7-4b76-82f8-0f8250ee98a7" Apr 17 16:19:51.207296 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.207267 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:51.207800 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:51.207437 2584 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:19:51.207800 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:51.207518 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret podName:bace55ce-7fc7-4b76-82f8-0f8250ee98a7 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:19:59.207481608 +0000 UTC m=+29.581394528 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret") pod "global-pull-secret-syncer-qhfkp" (UID: "bace55ce-7fc7-4b76-82f8-0f8250ee98a7") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:19:51.336242 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.336200 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-976rk" event={"ID":"b2b1e0e2-9eb2-4e4f-9027-c81e854a984c","Type":"ContainerStarted","Data":"bc7c67ae489abaf8c5db50bb6351043d9f0014988938d859123d3ee93e979c9a"} Apr 17 16:19:51.337548 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.337518 2584 generic.go:358] "Generic (PLEG): container finished" podID="816568b9527d9455f848c001abfac64a" containerID="b43a9f58f94ad84ef5c4b9bd2e6444cf69c0563585220fa3781bad619ee5b25d" exitCode=0 Apr 17 16:19:51.337663 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.337592 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" event={"ID":"816568b9527d9455f848c001abfac64a","Type":"ContainerDied","Data":"b43a9f58f94ad84ef5c4b9bd2e6444cf69c0563585220fa3781bad619ee5b25d"} Apr 17 16:19:51.338914 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.338894 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fcm59" event={"ID":"e898edb3-e2ac-4eca-a223-fa4687085a0e","Type":"ContainerStarted","Data":"e3ed303841ab318589afc1e350d8eb9fed53abb1da92a8323ea8d2230c364351"} Apr 17 16:19:51.340154 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.340137 2584 generic.go:358] "Generic (PLEG): container finished" podID="c20048a9-0ed2-477d-9b33-25dc727aeda5" containerID="1d9d0a37b523327bac93bd6994597e8c2833be05035a9d7f37b57f9b1496c5c2" exitCode=0 Apr 17 16:19:51.340226 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.340190 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6896" event={"ID":"c20048a9-0ed2-477d-9b33-25dc727aeda5","Type":"ContainerDied","Data":"1d9d0a37b523327bac93bd6994597e8c2833be05035a9d7f37b57f9b1496c5c2"} Apr 17 16:19:51.341560 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.341476 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" event={"ID":"113e554f-2f68-4d9a-9462-cb366ab6d005","Type":"ContainerStarted","Data":"67744e49a85f5c480cfaf8aadc7b969e4be46acb6c257ee0638e5c329a05ffa5"} Apr 17 16:19:51.344249 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.344228 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" event={"ID":"e12c6987-366e-4f26-ae6c-75cc6a5d3967","Type":"ContainerStarted","Data":"a94c5097b3896c8a6484bad4ea6c86f14714e283ee4fbdab0874034bb0d04b63"} Apr 17 16:19:51.345431 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.345408 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cdxj4" event={"ID":"4dd52096-bca5-4442-852e-5f41d1bb9827","Type":"ContainerStarted","Data":"f15c3b71242bacf295f83058c733c407cd7436ade547c7a26fc23b6d8b47971f"} Apr 17 16:19:51.346489 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.346470 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v59gz" 
event={"ID":"ebb991a9-7c10-423a-8330-afafc79edd8c","Type":"ContainerStarted","Data":"01406fd176e1814220abfbb6ab161e2c3fbda2a7ca7f41b50ffe73ed48174d75"} Apr 17 16:19:51.351610 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.350726 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-976rk" podStartSLOduration=3.178153077 podStartE2EDuration="21.350710526s" podCreationTimestamp="2026-04-17 16:19:30 +0000 UTC" firstStartedPulling="2026-04-17 16:19:31.478033574 +0000 UTC m=+1.851946500" lastFinishedPulling="2026-04-17 16:19:49.650591024 +0000 UTC m=+20.024503949" observedRunningTime="2026-04-17 16:19:51.349843381 +0000 UTC m=+21.723756315" watchObservedRunningTime="2026-04-17 16:19:51.350710526 +0000 UTC m=+21.724623469" Apr 17 16:19:51.380160 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.380120 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-v59gz" podStartSLOduration=3.108082047 podStartE2EDuration="21.380106456s" podCreationTimestamp="2026-04-17 16:19:30 +0000 UTC" firstStartedPulling="2026-04-17 16:19:31.378538949 +0000 UTC m=+1.752451868" lastFinishedPulling="2026-04-17 16:19:49.650563341 +0000 UTC m=+20.024476277" observedRunningTime="2026-04-17 16:19:51.379670818 +0000 UTC m=+21.753583760" watchObservedRunningTime="2026-04-17 16:19:51.380106456 +0000 UTC m=+21.754019398" Apr 17 16:19:51.391653 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.391609 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fcm59" podStartSLOduration=3.558952171 podStartE2EDuration="21.391595418s" podCreationTimestamp="2026-04-17 16:19:30 +0000 UTC" firstStartedPulling="2026-04-17 16:19:31.495732716 +0000 UTC m=+1.869645636" lastFinishedPulling="2026-04-17 16:19:49.32837596 +0000 UTC m=+19.702288883" observedRunningTime="2026-04-17 16:19:51.391335609 +0000 UTC m=+21.765248552" watchObservedRunningTime="2026-04-17 16:19:51.391595418 +0000 UTC m=+21.765508361" Apr 17 16:19:51.413862 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.413818 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cdxj4" podStartSLOduration=3.4709938 podStartE2EDuration="21.413803685s" podCreationTimestamp="2026-04-17 16:19:30 +0000 UTC" firstStartedPulling="2026-04-17 16:19:31.385603085 +0000 UTC m=+1.759516008" lastFinishedPulling="2026-04-17 16:19:49.328412971 +0000 UTC m=+19.702325893" observedRunningTime="2026-04-17 16:19:51.413187044 +0000 UTC m=+21.787099987" watchObservedRunningTime="2026-04-17 16:19:51.413803685 +0000 UTC m=+21.787716629" Apr 17 16:19:51.510273 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:51.510248 2584 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 16:19:52.109473 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:52.109370 2584 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:19:51.510268026Z","UUID":"76503ae1-8520-44af-b57d-21de458cd41f","Handler":null,"Name":"","Endpoint":""} Apr 17 16:19:52.111492 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:52.111116 2584 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock 
versions: 1.0.0 Apr 17 16:19:52.111492 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:52.111145 2584 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 16:19:52.185254 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:52.185221 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:52.185410 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:52.185330 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:52.350266 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:52.350229 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" event={"ID":"816568b9527d9455f848c001abfac64a","Type":"ContainerStarted","Data":"990d6a9e291b057e141776436f622ac16fbb3c3ab5d2dfa48e6fb1451b6aca0b"} Apr 17 16:19:52.353325 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:52.353291 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" event={"ID":"113e554f-2f68-4d9a-9462-cb366ab6d005","Type":"ContainerStarted","Data":"973f2cba6f1f436bdd2fc64dd1e809fa38a660588445ec811f1955c216a0aa63"} Apr 17 16:19:52.372166 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:52.372083 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" podStartSLOduration=22.372063805 podStartE2EDuration="22.372063805s" podCreationTimestamp="2026-04-17 16:19:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:19:52.37143439 +0000 UTC m=+22.745347344" watchObservedRunningTime="2026-04-17 16:19:52.372063805 +0000 UTC m=+22.745976747" Apr 17 16:19:53.185358 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:53.185327 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:53.185482 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:53.185366 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:53.185482 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:53.185430 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:53.185586 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:53.185481 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhfkp" podUID="bace55ce-7fc7-4b76-82f8-0f8250ee98a7" Apr 17 16:19:53.356651 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:53.356616 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" event={"ID":"113e554f-2f68-4d9a-9462-cb366ab6d005","Type":"ContainerStarted","Data":"a12786e86d2f0f3498973d320e8d831760ddee7f4990181712b455cfc5e2d840"} Apr 17 16:19:53.359958 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:53.359926 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" event={"ID":"e12c6987-366e-4f26-ae6c-75cc6a5d3967","Type":"ContainerStarted","Data":"5cf78c163dbb03e989a69a656b49cc4d169ecf48db04c1324d7a0c69dc2f94a0"} Apr 17 16:19:53.372822 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:53.372776 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d8vsn" podStartSLOduration=2.358548902 podStartE2EDuration="23.372765477s" podCreationTimestamp="2026-04-17 16:19:30 +0000 UTC" firstStartedPulling="2026-04-17 16:19:31.433399262 +0000 UTC m=+1.807312183" lastFinishedPulling="2026-04-17 16:19:52.44761582 +0000 UTC m=+22.821528758" observedRunningTime="2026-04-17 16:19:53.372279366 +0000 UTC m=+23.746192308" watchObservedRunningTime="2026-04-17 16:19:53.372765477 +0000 UTC m=+23.746678419" Apr 17 16:19:54.057146 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:54.057109 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-v59gz" Apr 17 16:19:54.057712 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:54.057696 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-v59gz" Apr 17 16:19:54.184876 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:54.184841 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:54.185018 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:54.184939 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:54.362232 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:54.362018 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-v59gz" Apr 17 16:19:54.362645 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:54.362349 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-v59gz" Apr 17 16:19:55.185610 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:55.185581 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:55.185835 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:55.185581 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:55.185835 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:55.185679 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:55.185835 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:55.185752 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhfkp" podUID="bace55ce-7fc7-4b76-82f8-0f8250ee98a7" Apr 17 16:19:55.368861 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:55.368823 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" event={"ID":"e12c6987-366e-4f26-ae6c-75cc6a5d3967","Type":"ContainerStarted","Data":"32e0efdc560eba3b5b7acdd7692d0024934ac2783245ef79b479de43bdc8030b"} Apr 17 16:19:55.369560 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:55.369072 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:55.369560 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:55.369101 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:55.383766 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:55.383745 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:55.397289 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:55.397250 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" podStartSLOduration=6.978437158 podStartE2EDuration="25.397240606s" podCreationTimestamp="2026-04-17 16:19:30 +0000 UTC" firstStartedPulling="2026-04-17 16:19:31.415856902 +0000 UTC m=+1.789769826" lastFinishedPulling="2026-04-17 16:19:49.834660335 +0000 UTC m=+20.208573274" observedRunningTime="2026-04-17 16:19:55.395867826 +0000 UTC m=+25.769780768" watchObservedRunningTime="2026-04-17 16:19:55.397240606 +0000 UTC m=+25.771153547" Apr 17 16:19:56.185632 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:56.185599 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:56.185825 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:56.185740 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:56.371552 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:56.371360 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:56.386931 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:56.386900 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:19:56.716184 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:56.716148 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m64jw"] Apr 17 16:19:56.716363 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:56.716287 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:56.716426 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:56.716380 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:56.719180 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:56.719142 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qhfkp"] Apr 17 16:19:56.719303 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:56.719246 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:56.719380 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:56.719354 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhfkp" podUID="bace55ce-7fc7-4b76-82f8-0f8250ee98a7" Apr 17 16:19:56.719878 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:56.719856 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cgrms"] Apr 17 16:19:56.720042 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:56.719955 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:56.720223 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:56.720041 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:58.184667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:58.184635 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:19:58.185024 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:58.184636 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:58.185024 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:58.184839 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:19:58.185024 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:58.184950 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhfkp" podUID="bace55ce-7fc7-4b76-82f8-0f8250ee98a7" Apr 17 16:19:59.185351 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:59.185327 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:19:59.185731 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:59.185427 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:19:59.272474 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:59.272443 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:19:59.272606 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:59.272588 2584 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:19:59.272652 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:19:59.272643 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret podName:bace55ce-7fc7-4b76-82f8-0f8250ee98a7 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:15.272629663 +0000 UTC m=+45.646542582 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret") pod "global-pull-secret-syncer-qhfkp" (UID: "bace55ce-7fc7-4b76-82f8-0f8250ee98a7") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:19:59.377274 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:59.377186 2584 generic.go:358] "Generic (PLEG): container finished" podID="c20048a9-0ed2-477d-9b33-25dc727aeda5" containerID="9b92efa47991246308eddf47d7bdc3bcd22a4353461a05603403ac6d272f982e" exitCode=0 Apr 17 16:19:59.377274 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:19:59.377243 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6896" event={"ID":"c20048a9-0ed2-477d-9b33-25dc727aeda5","Type":"ContainerDied","Data":"9b92efa47991246308eddf47d7bdc3bcd22a4353461a05603403ac6d272f982e"} Apr 17 16:20:00.186715 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:00.186679 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:20:00.187139 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:00.186720 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:20:00.187139 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:00.186810 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:20:00.187139 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:00.186958 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhfkp" podUID="bace55ce-7fc7-4b76-82f8-0f8250ee98a7" Apr 17 16:20:00.380350 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:00.380325 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6896" event={"ID":"c20048a9-0ed2-477d-9b33-25dc727aeda5","Type":"ContainerStarted","Data":"78561d8be30e3e6fdf58d7512b06e1e854d8997d8b54483dbff5a10cf72644fb"} Apr 17 16:20:01.184953 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:01.184912 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:20:01.185096 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:01.185056 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m64jw" podUID="a750395e-b985-4d27-bd7c-fc1fdea00304" Apr 17 16:20:01.385965 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:01.385929 2584 generic.go:358] "Generic (PLEG): container finished" podID="c20048a9-0ed2-477d-9b33-25dc727aeda5" containerID="78561d8be30e3e6fdf58d7512b06e1e854d8997d8b54483dbff5a10cf72644fb" exitCode=0 Apr 17 16:20:01.386342 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:01.386004 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6896" event={"ID":"c20048a9-0ed2-477d-9b33-25dc727aeda5","Type":"ContainerDied","Data":"78561d8be30e3e6fdf58d7512b06e1e854d8997d8b54483dbff5a10cf72644fb"} Apr 17 16:20:02.185467 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.185438 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:20:02.185664 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.185438 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:20:02.185664 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:02.185580 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhfkp" podUID="bace55ce-7fc7-4b76-82f8-0f8250ee98a7" Apr 17 16:20:02.185664 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:02.185634 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:20:02.897600 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.897569 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:20:02.897972 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:02.897672 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:02.897972 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:02.897718 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs podName:dcbdb9dc-df9a-4c0b-850e-370061051a08 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:34.897704494 +0000 UTC m=+65.271617414 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs") pod "network-metrics-daemon-cgrms" (UID: "dcbdb9dc-df9a-4c0b-850e-370061051a08") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:02.952144 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.952121 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeReady" Apr 17 16:20:02.952266 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.952233 2584 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 16:20:02.984408 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.984378 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-64df8799cb-6mcdx"] Apr 17 16:20:02.987124 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.987110 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:02.989591 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.989572 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 16:20:02.989672 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.989649 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5mbwd\"" Apr 17 16:20:02.989874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.989858 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 16:20:02.989931 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.989914 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 16:20:02.994751 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.994448 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 16:20:02.994988 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.994974 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5sm7s"] Apr 17 16:20:02.997644 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.997629 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:20:02.998067 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.998043 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcnpj\" (UniqueName: \"kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj\") pod \"network-check-target-m64jw\" (UID: \"a750395e-b985-4d27-bd7c-fc1fdea00304\") " pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:20:02.998214 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:02.998198 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:20:02.998265 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:02.998220 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:20:02.998265 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:02.998233 2584 projected.go:194] Error preparing data for projected volume kube-api-access-tcnpj for pod openshift-network-diagnostics/network-check-target-m64jw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:02.998341 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:02.998285 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj podName:a750395e-b985-4d27-bd7c-fc1fdea00304 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:34.998268368 +0000 UTC m=+65.372181288 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tcnpj" (UniqueName: "kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj") pod "network-check-target-m64jw" (UID: "a750395e-b985-4d27-bd7c-fc1fdea00304") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:02.998447 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.998425 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64df8799cb-6mcdx"] Apr 17 16:20:02.999954 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.999937 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 16:20:03.000029 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:02.999942 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9rhb9\"" Apr 17 16:20:03.009418 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.009399 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 16:20:03.009530 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.009444 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 16:20:03.010160 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.010141 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5sm7s"] Apr 17 16:20:03.096997 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.094640 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ddhcx"] Apr 17 16:20:03.098401 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.098381 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02673158-23e7-43fc-8144-c9a194260603-registry-certificates\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.098471 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.098413 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02673158-23e7-43fc-8144-c9a194260603-trusted-ca\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.098471 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.098429 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:03.098471 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.098438 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crc5m\" (UniqueName: \"kubernetes.io/projected/93428031-c4fe-4f1f-a088-b038195cf17e-kube-api-access-crc5m\") pod \"ingress-canary-5sm7s\" (UID: \"93428031-c4fe-4f1f-a088-b038195cf17e\") " pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:20:03.098609 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.098540 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert\") pod \"ingress-canary-5sm7s\" (UID: \"93428031-c4fe-4f1f-a088-b038195cf17e\") " pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:20:03.098609 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.098598 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.098706 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.098622 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02673158-23e7-43fc-8144-c9a194260603-ca-trust-extracted\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.098706 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.098645 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02673158-23e7-43fc-8144-c9a194260603-installation-pull-secrets\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.098706 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.098680 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnh4l\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-kube-api-access-rnh4l\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.098822 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.098708 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02673158-23e7-43fc-8144-c9a194260603-image-registry-private-configuration\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.098822 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.098742 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-bound-sa-token\") pod \"image-registry-64df8799cb-6mcdx\" 
(UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.100966 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.100937 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 16:20:03.100966 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.100960 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 16:20:03.101157 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.101022 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vgznp\"" Apr 17 16:20:03.105853 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.105834 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ddhcx"] Apr 17 16:20:03.185416 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.185347 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:20:03.188066 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.188048 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:20:03.188135 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.188067 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:20:03.188135 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.188053 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-59zbx\"" Apr 17 16:20:03.198988 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.198967 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crc5m\" (UniqueName: \"kubernetes.io/projected/93428031-c4fe-4f1f-a088-b038195cf17e-kube-api-access-crc5m\") pod \"ingress-canary-5sm7s\" (UID: \"93428031-c4fe-4f1f-a088-b038195cf17e\") " pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:20:03.199070 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199006 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert\") pod \"ingress-canary-5sm7s\" (UID: \"93428031-c4fe-4f1f-a088-b038195cf17e\") " pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:20:03.199070 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199033 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmp9w\" (UniqueName: \"kubernetes.io/projected/934467f3-270b-4b90-b4e8-331914b57c8d-kube-api-access-xmp9w\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:03.199070 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199052 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.199070 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199067 2584 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02673158-23e7-43fc-8144-c9a194260603-ca-trust-extracted\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.199235 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199083 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02673158-23e7-43fc-8144-c9a194260603-installation-pull-secrets\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.199235 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199104 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnh4l\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-kube-api-access-rnh4l\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.199235 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:03.199112 2584 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:03.199235 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199121 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02673158-23e7-43fc-8144-c9a194260603-image-registry-private-configuration\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.199235 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199143 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-bound-sa-token\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.199235 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:03.199167 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert podName:93428031-c4fe-4f1f-a088-b038195cf17e nodeName:}" failed. No retries permitted until 2026-04-17 16:20:03.699149526 +0000 UTC m=+34.073062450 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert") pod "ingress-canary-5sm7s" (UID: "93428031-c4fe-4f1f-a088-b038195cf17e") : secret "canary-serving-cert" not found Apr 17 16:20:03.199235 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:03.199188 2584 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:20:03.199235 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:03.199207 2584 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64df8799cb-6mcdx: secret "image-registry-tls" not found Apr 17 16:20:03.199235 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199201 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/934467f3-270b-4b90-b4e8-331914b57c8d-config-volume\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:03.199665 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:03.199244 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls podName:02673158-23e7-43fc-8144-c9a194260603 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:03.699233114 +0000 UTC m=+34.073146034 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls") pod "image-registry-64df8799cb-6mcdx" (UID: "02673158-23e7-43fc-8144-c9a194260603") : secret "image-registry-tls" not found Apr 17 16:20:03.199665 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199284 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02673158-23e7-43fc-8144-c9a194260603-registry-certificates\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.199665 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199312 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/934467f3-270b-4b90-b4e8-331914b57c8d-tmp-dir\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:03.199665 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199346 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02673158-23e7-43fc-8144-c9a194260603-trusted-ca\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.199665 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199442 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:03.199665 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199627 2584 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02673158-23e7-43fc-8144-c9a194260603-ca-trust-extracted\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.199949 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.199890 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02673158-23e7-43fc-8144-c9a194260603-registry-certificates\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.200072 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.200055 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02673158-23e7-43fc-8144-c9a194260603-trusted-ca\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.203114 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.203095 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02673158-23e7-43fc-8144-c9a194260603-image-registry-private-configuration\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.203193 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.203099 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02673158-23e7-43fc-8144-c9a194260603-installation-pull-secrets\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.209824 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.209800 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crc5m\" (UniqueName: \"kubernetes.io/projected/93428031-c4fe-4f1f-a088-b038195cf17e-kube-api-access-crc5m\") pod \"ingress-canary-5sm7s\" (UID: \"93428031-c4fe-4f1f-a088-b038195cf17e\") " pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:20:03.210140 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.210117 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-bound-sa-token\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.210679 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.210663 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnh4l\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-kube-api-access-rnh4l\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.299903 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.299857 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmp9w\" (UniqueName: 
\"kubernetes.io/projected/934467f3-270b-4b90-b4e8-331914b57c8d-kube-api-access-xmp9w\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:03.300019 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.299925 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/934467f3-270b-4b90-b4e8-331914b57c8d-config-volume\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:03.300019 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.299943 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/934467f3-270b-4b90-b4e8-331914b57c8d-tmp-dir\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:03.300019 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.299968 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:03.300156 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:03.300060 2584 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:03.300156 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:03.300146 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls podName:934467f3-270b-4b90-b4e8-331914b57c8d nodeName:}" failed. No retries permitted until 2026-04-17 16:20:03.800129765 +0000 UTC m=+34.174042685 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls") pod "dns-default-ddhcx" (UID: "934467f3-270b-4b90-b4e8-331914b57c8d") : secret "dns-default-metrics-tls" not found Apr 17 16:20:03.300388 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.300370 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/934467f3-270b-4b90-b4e8-331914b57c8d-tmp-dir\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:03.300479 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.300459 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/934467f3-270b-4b90-b4e8-331914b57c8d-config-volume\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:03.308234 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.308217 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmp9w\" (UniqueName: \"kubernetes.io/projected/934467f3-270b-4b90-b4e8-331914b57c8d-kube-api-access-xmp9w\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:03.391312 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.391288 2584 generic.go:358] "Generic (PLEG): container finished" podID="c20048a9-0ed2-477d-9b33-25dc727aeda5" containerID="e9aff32e64c1b17dd344c903fa0cc886f96ca8574a7d245c4243b70724c42acb" exitCode=0 Apr 17 16:20:03.391429 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.391332 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6896" event={"ID":"c20048a9-0ed2-477d-9b33-25dc727aeda5","Type":"ContainerDied","Data":"e9aff32e64c1b17dd344c903fa0cc886f96ca8574a7d245c4243b70724c42acb"} Apr 17 16:20:03.702580 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.702548 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert\") pod \"ingress-canary-5sm7s\" (UID: \"93428031-c4fe-4f1f-a088-b038195cf17e\") " pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:20:03.702752 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.702594 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:03.702752 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:03.702702 2584 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:03.702752 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:03.702726 2584 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:20:03.702752 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:03.702737 2584 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64df8799cb-6mcdx: secret "image-registry-tls" not found Apr 17 16:20:03.702877 ip-10-0-134-77 
kubenswrapper[2584]: E0417 16:20:03.702769 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert podName:93428031-c4fe-4f1f-a088-b038195cf17e nodeName:}" failed. No retries permitted until 2026-04-17 16:20:04.702752043 +0000 UTC m=+35.076664982 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert") pod "ingress-canary-5sm7s" (UID: "93428031-c4fe-4f1f-a088-b038195cf17e") : secret "canary-serving-cert" not found Apr 17 16:20:03.702877 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:03.702785 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls podName:02673158-23e7-43fc-8144-c9a194260603 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:04.702777747 +0000 UTC m=+35.076690670 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls") pod "image-registry-64df8799cb-6mcdx" (UID: "02673158-23e7-43fc-8144-c9a194260603") : secret "image-registry-tls" not found Apr 17 16:20:03.803766 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:03.803728 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:03.803926 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:03.803838 2584 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:03.803926 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:03.803897 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls podName:934467f3-270b-4b90-b4e8-331914b57c8d nodeName:}" failed. No retries permitted until 2026-04-17 16:20:04.803882062 +0000 UTC m=+35.177794982 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls") pod "dns-default-ddhcx" (UID: "934467f3-270b-4b90-b4e8-331914b57c8d") : secret "dns-default-metrics-tls" not found Apr 17 16:20:04.188333 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:04.188304 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:20:04.188783 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:04.188304 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:20:04.190780 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:04.190760 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:20:04.190910 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:04.190803 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 16:20:04.190910 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:04.190828 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qr2mj\"" Apr 17 16:20:04.710608 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:04.710575 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert\") pod \"ingress-canary-5sm7s\" (UID: \"93428031-c4fe-4f1f-a088-b038195cf17e\") " pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:20:04.710752 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:04.710618 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:04.710752 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:04.710730 2584 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:04.710825 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:04.710761 2584 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:20:04.710825 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:04.710773 2584 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64df8799cb-6mcdx: secret "image-registry-tls" not found Apr 17 16:20:04.710825 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:04.710796 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert podName:93428031-c4fe-4f1f-a088-b038195cf17e nodeName:}" failed. No retries permitted until 2026-04-17 16:20:06.710778625 +0000 UTC m=+37.084691565 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert") pod "ingress-canary-5sm7s" (UID: "93428031-c4fe-4f1f-a088-b038195cf17e") : secret "canary-serving-cert" not found Apr 17 16:20:04.710825 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:04.710812 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls podName:02673158-23e7-43fc-8144-c9a194260603 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:06.710802744 +0000 UTC m=+37.084715683 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls") pod "image-registry-64df8799cb-6mcdx" (UID: "02673158-23e7-43fc-8144-c9a194260603") : secret "image-registry-tls" not found Apr 17 16:20:04.811476 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:04.811444 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:04.811603 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:04.811592 2584 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:04.811649 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:04.811645 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls podName:934467f3-270b-4b90-b4e8-331914b57c8d nodeName:}" failed. No retries permitted until 2026-04-17 16:20:06.811632521 +0000 UTC m=+37.185545441 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls") pod "dns-default-ddhcx" (UID: "934467f3-270b-4b90-b4e8-331914b57c8d") : secret "dns-default-metrics-tls" not found Apr 17 16:20:06.724787 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:06.724749 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert\") pod \"ingress-canary-5sm7s\" (UID: \"93428031-c4fe-4f1f-a088-b038195cf17e\") " pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:20:06.725266 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:06.724795 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:06.725266 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:06.724898 2584 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:06.725266 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:06.724913 2584 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:20:06.725266 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:06.724924 2584 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64df8799cb-6mcdx: secret "image-registry-tls" not found Apr 17 16:20:06.725266 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:06.724990 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert podName:93428031-c4fe-4f1f-a088-b038195cf17e nodeName:}" failed. No retries permitted until 2026-04-17 16:20:10.724964758 +0000 UTC m=+41.098877677 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert") pod "ingress-canary-5sm7s" (UID: "93428031-c4fe-4f1f-a088-b038195cf17e") : secret "canary-serving-cert" not found Apr 17 16:20:06.725266 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:06.725006 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls podName:02673158-23e7-43fc-8144-c9a194260603 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:10.725000539 +0000 UTC m=+41.098913459 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls") pod "image-registry-64df8799cb-6mcdx" (UID: "02673158-23e7-43fc-8144-c9a194260603") : secret "image-registry-tls" not found Apr 17 16:20:06.826095 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:06.826066 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:06.826236 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:06.826210 2584 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:06.826275 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:06.826270 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls podName:934467f3-270b-4b90-b4e8-331914b57c8d nodeName:}" failed. No retries permitted until 2026-04-17 16:20:10.826255238 +0000 UTC m=+41.200168158 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls") pod "dns-default-ddhcx" (UID: "934467f3-270b-4b90-b4e8-331914b57c8d") : secret "dns-default-metrics-tls" not found Apr 17 16:20:08.671209 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.671177 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9"] Apr 17 16:20:08.694132 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.694110 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9"] Apr 17 16:20:08.694132 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.694133 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8"] Apr 17 16:20:08.694296 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.694254 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9" Apr 17 16:20:08.696641 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.696619 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 16:20:08.696641 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.696633 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 16:20:08.696840 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.696822 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-cjp2k\"" Apr 17 16:20:08.697783 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.697764 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 16:20:08.697895 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.697881 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 16:20:08.714030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.713801 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn"] Apr 17 16:20:08.714138 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.714077 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 16:20:08.716577 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.716557 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 16:20:08.735005 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.734988 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8"] Apr 17 16:20:08.735005 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.735007 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn"] Apr 17 16:20:08.735142 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.735103 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.737610 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.737586 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 16:20:08.737610 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.737600 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 16:20:08.737765 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.737609 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 16:20:08.737765 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.737689 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 16:20:08.840475 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.840442 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djk5z\" (UniqueName: \"kubernetes.io/projected/340231c1-3824-44e5-baf9-ae36a08a198b-kube-api-access-djk5z\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.840616 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.840484 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/340231c1-3824-44e5-baf9-ae36a08a198b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.840616 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.840594 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3406f4a9-353e-4f09-9d82-56ea350d37e8-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5694b46cd9-dpcf9\" (UID: \"3406f4a9-353e-4f09-9d82-56ea350d37e8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9" Apr 17 16:20:08.840703 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.840618 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spzjc\" (UniqueName: \"kubernetes.io/projected/3406f4a9-353e-4f09-9d82-56ea350d37e8-kube-api-access-spzjc\") pod \"managed-serviceaccount-addon-agent-5694b46cd9-dpcf9\" (UID: \"3406f4a9-353e-4f09-9d82-56ea350d37e8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9" Apr 17 16:20:08.840703 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.840650 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ebac6b61-ba93-4d25-b409-4b535683b067-klusterlet-config\") pod \"klusterlet-addon-workmgr-6dc4f7cb58-54dh8\" (UID: \"ebac6b61-ba93-4d25-b409-4b535683b067\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 
16:20:08.840703 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.840676 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsvxd\" (UniqueName: \"kubernetes.io/projected/ebac6b61-ba93-4d25-b409-4b535683b067-kube-api-access-tsvxd\") pod \"klusterlet-addon-workmgr-6dc4f7cb58-54dh8\" (UID: \"ebac6b61-ba93-4d25-b409-4b535683b067\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 16:20:08.840703 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.840699 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/340231c1-3824-44e5-baf9-ae36a08a198b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.840855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.840756 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/340231c1-3824-44e5-baf9-ae36a08a198b-ca\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.840855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.840778 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/340231c1-3824-44e5-baf9-ae36a08a198b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.840855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.840801 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/340231c1-3824-44e5-baf9-ae36a08a198b-hub\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.840855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.840820 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ebac6b61-ba93-4d25-b409-4b535683b067-tmp\") pod \"klusterlet-addon-workmgr-6dc4f7cb58-54dh8\" (UID: \"ebac6b61-ba93-4d25-b409-4b535683b067\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 16:20:08.942151 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.942065 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/340231c1-3824-44e5-baf9-ae36a08a198b-hub\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.942151 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.942118 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ebac6b61-ba93-4d25-b409-4b535683b067-tmp\") pod \"klusterlet-addon-workmgr-6dc4f7cb58-54dh8\" 
(UID: \"ebac6b61-ba93-4d25-b409-4b535683b067\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 16:20:08.942362 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.942188 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djk5z\" (UniqueName: \"kubernetes.io/projected/340231c1-3824-44e5-baf9-ae36a08a198b-kube-api-access-djk5z\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.942362 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.942222 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/340231c1-3824-44e5-baf9-ae36a08a198b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.942462 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.942389 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3406f4a9-353e-4f09-9d82-56ea350d37e8-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5694b46cd9-dpcf9\" (UID: \"3406f4a9-353e-4f09-9d82-56ea350d37e8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9" Apr 17 16:20:08.942462 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.942422 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spzjc\" (UniqueName: \"kubernetes.io/projected/3406f4a9-353e-4f09-9d82-56ea350d37e8-kube-api-access-spzjc\") pod \"managed-serviceaccount-addon-agent-5694b46cd9-dpcf9\" (UID: \"3406f4a9-353e-4f09-9d82-56ea350d37e8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9" Apr 17 16:20:08.942598 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.942470 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ebac6b61-ba93-4d25-b409-4b535683b067-klusterlet-config\") pod \"klusterlet-addon-workmgr-6dc4f7cb58-54dh8\" (UID: \"ebac6b61-ba93-4d25-b409-4b535683b067\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 16:20:08.942598 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.942555 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsvxd\" (UniqueName: \"kubernetes.io/projected/ebac6b61-ba93-4d25-b409-4b535683b067-kube-api-access-tsvxd\") pod \"klusterlet-addon-workmgr-6dc4f7cb58-54dh8\" (UID: \"ebac6b61-ba93-4d25-b409-4b535683b067\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 16:20:08.942598 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.942585 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/340231c1-3824-44e5-baf9-ae36a08a198b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.942750 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.942630 2584 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/340231c1-3824-44e5-baf9-ae36a08a198b-ca\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.942750 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.942654 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/340231c1-3824-44e5-baf9-ae36a08a198b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.945814 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.945782 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3406f4a9-353e-4f09-9d82-56ea350d37e8-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5694b46cd9-dpcf9\" (UID: \"3406f4a9-353e-4f09-9d82-56ea350d37e8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9" Apr 17 16:20:08.946323 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.946304 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ebac6b61-ba93-4d25-b409-4b535683b067-tmp\") pod \"klusterlet-addon-workmgr-6dc4f7cb58-54dh8\" (UID: \"ebac6b61-ba93-4d25-b409-4b535683b067\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 16:20:08.948717 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.948692 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ebac6b61-ba93-4d25-b409-4b535683b067-klusterlet-config\") pod \"klusterlet-addon-workmgr-6dc4f7cb58-54dh8\" (UID: \"ebac6b61-ba93-4d25-b409-4b535683b067\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 16:20:08.950783 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.950763 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsvxd\" (UniqueName: \"kubernetes.io/projected/ebac6b61-ba93-4d25-b409-4b535683b067-kube-api-access-tsvxd\") pod \"klusterlet-addon-workmgr-6dc4f7cb58-54dh8\" (UID: \"ebac6b61-ba93-4d25-b409-4b535683b067\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 16:20:08.950878 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.950791 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spzjc\" (UniqueName: \"kubernetes.io/projected/3406f4a9-353e-4f09-9d82-56ea350d37e8-kube-api-access-spzjc\") pod \"managed-serviceaccount-addon-agent-5694b46cd9-dpcf9\" (UID: \"3406f4a9-353e-4f09-9d82-56ea350d37e8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9" Apr 17 16:20:08.955432 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.955339 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/340231c1-3824-44e5-baf9-ae36a08a198b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" 
Apr 17 16:20:08.955432 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.955346 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/340231c1-3824-44e5-baf9-ae36a08a198b-hub\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.955432 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.955397 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/340231c1-3824-44e5-baf9-ae36a08a198b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.955805 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.955782 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/340231c1-3824-44e5-baf9-ae36a08a198b-ca\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.955971 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.955932 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/340231c1-3824-44e5-baf9-ae36a08a198b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:08.956078 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:08.956016 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djk5z\" (UniqueName: \"kubernetes.io/projected/340231c1-3824-44e5-baf9-ae36a08a198b-kube-api-access-djk5z\") pod \"cluster-proxy-proxy-agent-7bcc89b8f-zclrn\" (UID: \"340231c1-3824-44e5-baf9-ae36a08a198b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:09.015366 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:09.015334 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9" Apr 17 16:20:09.033311 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:09.033248 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 16:20:09.044949 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:09.044930 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:20:09.245524 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:09.244735 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8"] Apr 17 16:20:09.246249 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:09.246097 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn"] Apr 17 16:20:09.247687 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:09.247551 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9"] Apr 17 16:20:09.250569 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:20:09.250539 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3406f4a9_353e_4f09_9d82_56ea350d37e8.slice/crio-7e9f645651af3fa2023a5601f410aa78672aeb71062cbafd2569cbe8294cc1dd WatchSource:0}: Error finding container 7e9f645651af3fa2023a5601f410aa78672aeb71062cbafd2569cbe8294cc1dd: Status 404 returned error can't find the container with id 7e9f645651af3fa2023a5601f410aa78672aeb71062cbafd2569cbe8294cc1dd Apr 17 16:20:09.402221 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:09.402183 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" event={"ID":"ebac6b61-ba93-4d25-b409-4b535683b067","Type":"ContainerStarted","Data":"ed42ffb18d5be54139445f498a2d66cc8986198cc18c548f693eb1adce8e9843"} Apr 17 16:20:09.403159 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:09.403132 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9" event={"ID":"3406f4a9-353e-4f09-9d82-56ea350d37e8","Type":"ContainerStarted","Data":"7e9f645651af3fa2023a5601f410aa78672aeb71062cbafd2569cbe8294cc1dd"} Apr 17 16:20:09.404080 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:09.404057 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" event={"ID":"340231c1-3824-44e5-baf9-ae36a08a198b","Type":"ContainerStarted","Data":"65300ce668b4280dbee623271c8944e9fd1641326fbe5b3a4e8e82ed2e9049e0"} Apr 17 16:20:10.758585 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:10.758547 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert\") pod \"ingress-canary-5sm7s\" (UID: \"93428031-c4fe-4f1f-a088-b038195cf17e\") " pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:20:10.759176 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:10.758613 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:10.759176 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:10.758739 2584 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:20:10.759176 ip-10-0-134-77 kubenswrapper[2584]: E0417 
16:20:10.758753 2584 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64df8799cb-6mcdx: secret "image-registry-tls" not found Apr 17 16:20:10.759176 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:10.758819 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls podName:02673158-23e7-43fc-8144-c9a194260603 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:18.758797115 +0000 UTC m=+49.132710052 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls") pod "image-registry-64df8799cb-6mcdx" (UID: "02673158-23e7-43fc-8144-c9a194260603") : secret "image-registry-tls" not found Apr 17 16:20:10.759459 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:10.759233 2584 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:10.759459 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:10.759278 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert podName:93428031-c4fe-4f1f-a088-b038195cf17e nodeName:}" failed. No retries permitted until 2026-04-17 16:20:18.759264276 +0000 UTC m=+49.133177197 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert") pod "ingress-canary-5sm7s" (UID: "93428031-c4fe-4f1f-a088-b038195cf17e") : secret "canary-serving-cert" not found Apr 17 16:20:10.860032 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:10.859992 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:10.860264 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:10.860188 2584 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:10.860264 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:10.860252 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls podName:934467f3-270b-4b90-b4e8-331914b57c8d nodeName:}" failed. No retries permitted until 2026-04-17 16:20:18.860232445 +0000 UTC m=+49.234145377 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls") pod "dns-default-ddhcx" (UID: "934467f3-270b-4b90-b4e8-331914b57c8d") : secret "dns-default-metrics-tls" not found Apr 17 16:20:15.297856 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:15.297801 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:20:15.300530 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:15.300482 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bace55ce-7fc7-4b76-82f8-0f8250ee98a7-original-pull-secret\") pod \"global-pull-secret-syncer-qhfkp\" (UID: \"bace55ce-7fc7-4b76-82f8-0f8250ee98a7\") " pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:20:15.302379 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:15.302352 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfkp" Apr 17 16:20:17.276609 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:17.276562 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qhfkp"] Apr 17 16:20:17.423436 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:17.423402 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6896" event={"ID":"c20048a9-0ed2-477d-9b33-25dc727aeda5","Type":"ContainerStarted","Data":"24653f53e09e5b9c2e6c2dea1865860c81597e8cd3959e1a412d55315a452b3f"} Apr 17 16:20:17.424818 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:17.424794 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" event={"ID":"340231c1-3824-44e5-baf9-ae36a08a198b","Type":"ContainerStarted","Data":"81935a9864d56f8dca8fde1be7abe601b517bb6d2c0f893bb30c2b7b3ac94e81"} Apr 17 16:20:17.426104 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:17.426079 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" event={"ID":"ebac6b61-ba93-4d25-b409-4b535683b067","Type":"ContainerStarted","Data":"642775d389cf716393e2d76acae4f49b3e06f1ff7779139a6e0be35772f8e1e0"} Apr 17 16:20:17.426310 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:17.426264 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 16:20:17.427228 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:17.427202 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qhfkp" event={"ID":"bace55ce-7fc7-4b76-82f8-0f8250ee98a7","Type":"ContainerStarted","Data":"f0e6a13001197068713466cb13a5d5b13678a5eed4b1d88e59fd99f8d8c4e69f"} Apr 17 16:20:17.428257 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:17.428222 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 16:20:17.428559 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:17.428491 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9" event={"ID":"3406f4a9-353e-4f09-9d82-56ea350d37e8","Type":"ContainerStarted","Data":"b258051f313b75993f6dcc1ddd95b0dd4ac500a876b1cd2aaacfa6f0fed1cab0"} Apr 17 16:20:17.460512 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:17.460462 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" podStartSLOduration=1.557469184 podStartE2EDuration="9.460448818s" podCreationTimestamp="2026-04-17 16:20:08 +0000 UTC" firstStartedPulling="2026-04-17 16:20:09.251118568 +0000 UTC m=+39.625031488" lastFinishedPulling="2026-04-17 16:20:17.154098203 +0000 UTC m=+47.528011122" observedRunningTime="2026-04-17 16:20:17.460275499 +0000 UTC m=+47.834188442" watchObservedRunningTime="2026-04-17 16:20:17.460448818 +0000 UTC m=+47.834361757" Apr 17 16:20:17.472900 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:17.472853 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9" podStartSLOduration=1.571054481 podStartE2EDuration="9.472839162s" podCreationTimestamp="2026-04-17 16:20:08 +0000 UTC" firstStartedPulling="2026-04-17 16:20:09.25251885 +0000 UTC m=+39.626431786" lastFinishedPulling="2026-04-17 16:20:17.154303546 +0000 UTC m=+47.528216467" observedRunningTime="2026-04-17 16:20:17.472634616 +0000 UTC m=+47.846547559" watchObservedRunningTime="2026-04-17 16:20:17.472839162 +0000 UTC m=+47.846752103" Apr 17 16:20:18.433593 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:18.433548 2584 generic.go:358] "Generic (PLEG): container finished" podID="c20048a9-0ed2-477d-9b33-25dc727aeda5" containerID="24653f53e09e5b9c2e6c2dea1865860c81597e8cd3959e1a412d55315a452b3f" exitCode=0 Apr 17 16:20:18.434112 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:18.433672 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6896" event={"ID":"c20048a9-0ed2-477d-9b33-25dc727aeda5","Type":"ContainerDied","Data":"24653f53e09e5b9c2e6c2dea1865860c81597e8cd3959e1a412d55315a452b3f"} Apr 17 16:20:18.825406 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:18.825369 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert\") pod \"ingress-canary-5sm7s\" (UID: \"93428031-c4fe-4f1f-a088-b038195cf17e\") " pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:20:18.825611 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:18.825433 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:18.825611 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:18.825553 2584 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:18.825611 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:18.825593 2584 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:20:18.825774 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:18.825609 2584 projected.go:194] Error 
preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64df8799cb-6mcdx: secret "image-registry-tls" not found Apr 17 16:20:18.825774 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:18.825612 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert podName:93428031-c4fe-4f1f-a088-b038195cf17e nodeName:}" failed. No retries permitted until 2026-04-17 16:20:34.825597875 +0000 UTC m=+65.199510794 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert") pod "ingress-canary-5sm7s" (UID: "93428031-c4fe-4f1f-a088-b038195cf17e") : secret "canary-serving-cert" not found Apr 17 16:20:18.825774 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:18.825676 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls podName:02673158-23e7-43fc-8144-c9a194260603 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:34.825660579 +0000 UTC m=+65.199573507 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls") pod "image-registry-64df8799cb-6mcdx" (UID: "02673158-23e7-43fc-8144-c9a194260603") : secret "image-registry-tls" not found Apr 17 16:20:18.926068 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:18.926026 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:18.926247 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:18.926169 2584 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:18.926247 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:18.926237 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls podName:934467f3-270b-4b90-b4e8-331914b57c8d nodeName:}" failed. No retries permitted until 2026-04-17 16:20:34.926219217 +0000 UTC m=+65.300132145 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls") pod "dns-default-ddhcx" (UID: "934467f3-270b-4b90-b4e8-331914b57c8d") : secret "dns-default-metrics-tls" not found Apr 17 16:20:19.438939 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:19.438903 2584 generic.go:358] "Generic (PLEG): container finished" podID="c20048a9-0ed2-477d-9b33-25dc727aeda5" containerID="3096425516cfe6c1061d79f92bd2d0d9e8473ff1df529fe052ea59b048c440b3" exitCode=0 Apr 17 16:20:19.439390 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:19.438987 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6896" event={"ID":"c20048a9-0ed2-477d-9b33-25dc727aeda5","Type":"ContainerDied","Data":"3096425516cfe6c1061d79f92bd2d0d9e8473ff1df529fe052ea59b048c440b3"} Apr 17 16:20:20.446903 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:20.446866 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6896" event={"ID":"c20048a9-0ed2-477d-9b33-25dc727aeda5","Type":"ContainerStarted","Data":"2d0749030e79face35e17564cee8c31e81408ddd7983c3fb6b30f5b90e59ce9e"} Apr 17 16:20:20.469290 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:20.469239 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-p6896" podStartSLOduration=4.799199153 podStartE2EDuration="50.469226048s" podCreationTimestamp="2026-04-17 16:19:30 +0000 UTC" firstStartedPulling="2026-04-17 16:19:31.483565816 +0000 UTC m=+1.857478739" lastFinishedPulling="2026-04-17 16:20:17.153592696 +0000 UTC m=+47.527505634" observedRunningTime="2026-04-17 16:20:20.467668878 +0000 UTC m=+50.841581819" watchObservedRunningTime="2026-04-17 16:20:20.469226048 +0000 UTC m=+50.843138989" Apr 17 16:20:21.451237 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:21.451185 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" event={"ID":"340231c1-3824-44e5-baf9-ae36a08a198b","Type":"ContainerStarted","Data":"5a6285fed33601e33db5ced03a815d8bb96a28879b2d0cd72a69cc924d8189e0"} Apr 17 16:20:21.451721 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:21.451247 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" event={"ID":"340231c1-3824-44e5-baf9-ae36a08a198b","Type":"ContainerStarted","Data":"e7e163974224e37305658ccfc7f5b9ec01c3fcb678eca485877544c5c303cd16"} Apr 17 16:20:21.469713 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:21.469662 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" podStartSLOduration=2.192829201 podStartE2EDuration="13.469649242s" podCreationTimestamp="2026-04-17 16:20:08 +0000 UTC" firstStartedPulling="2026-04-17 16:20:09.251011148 +0000 UTC m=+39.624924068" lastFinishedPulling="2026-04-17 16:20:20.527831175 +0000 UTC m=+50.901744109" observedRunningTime="2026-04-17 16:20:21.46832154 +0000 UTC m=+51.842234504" watchObservedRunningTime="2026-04-17 16:20:21.469649242 +0000 UTC m=+51.843562190" Apr 17 16:20:22.454932 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:22.454889 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qhfkp" 
event={"ID":"bace55ce-7fc7-4b76-82f8-0f8250ee98a7","Type":"ContainerStarted","Data":"488a8398e8a3d068f666c27d67f2f7bee91f344569c42a5b08e826f2b57b0598"} Apr 17 16:20:22.469484 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:22.469442 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qhfkp" podStartSLOduration=34.458648971 podStartE2EDuration="39.469428785s" podCreationTimestamp="2026-04-17 16:19:43 +0000 UTC" firstStartedPulling="2026-04-17 16:20:17.28795977 +0000 UTC m=+47.661872697" lastFinishedPulling="2026-04-17 16:20:22.298739592 +0000 UTC m=+52.672652511" observedRunningTime="2026-04-17 16:20:22.469257536 +0000 UTC m=+52.843170480" watchObservedRunningTime="2026-04-17 16:20:22.469428785 +0000 UTC m=+52.843341726" Apr 17 16:20:28.385396 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:28.385356 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qvmb" Apr 17 16:20:34.847864 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:34.847828 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert\") pod \"ingress-canary-5sm7s\" (UID: \"93428031-c4fe-4f1f-a088-b038195cf17e\") " pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:20:34.848345 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:34.847874 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:20:34.848345 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:34.847968 2584 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:34.848345 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:34.847983 2584 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:20:34.848345 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:34.847993 2584 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64df8799cb-6mcdx: secret "image-registry-tls" not found Apr 17 16:20:34.848345 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:34.848031 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert podName:93428031-c4fe-4f1f-a088-b038195cf17e nodeName:}" failed. No retries permitted until 2026-04-17 16:21:06.848016339 +0000 UTC m=+97.221929259 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert") pod "ingress-canary-5sm7s" (UID: "93428031-c4fe-4f1f-a088-b038195cf17e") : secret "canary-serving-cert" not found Apr 17 16:20:34.848345 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:34.848046 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls podName:02673158-23e7-43fc-8144-c9a194260603 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:06.848039808 +0000 UTC m=+97.221952728 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls") pod "image-registry-64df8799cb-6mcdx" (UID: "02673158-23e7-43fc-8144-c9a194260603") : secret "image-registry-tls" not found Apr 17 16:20:34.949110 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:34.949079 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:20:34.949225 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:34.949181 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:20:34.949319 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:34.949304 2584 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:34.949382 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:34.949370 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls podName:934467f3-270b-4b90-b4e8-331914b57c8d nodeName:}" failed. No retries permitted until 2026-04-17 16:21:06.949350915 +0000 UTC m=+97.323263857 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls") pod "dns-default-ddhcx" (UID: "934467f3-270b-4b90-b4e8-331914b57c8d") : secret "dns-default-metrics-tls" not found Apr 17 16:20:34.951769 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:34.951750 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:20:34.959809 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:34.959795 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:20:34.959887 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:20:34.959845 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs podName:dcbdb9dc-df9a-4c0b-850e-370061051a08 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:38.959830185 +0000 UTC m=+129.333743106 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs") pod "network-metrics-daemon-cgrms" (UID: "dcbdb9dc-df9a-4c0b-850e-370061051a08") : secret "metrics-daemon-secret" not found Apr 17 16:20:35.049810 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:35.049789 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcnpj\" (UniqueName: \"kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj\") pod \"network-check-target-m64jw\" (UID: \"a750395e-b985-4d27-bd7c-fc1fdea00304\") " pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:20:35.052527 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:35.052489 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:20:35.063666 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:35.063647 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:20:35.074179 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:35.074161 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcnpj\" (UniqueName: \"kubernetes.io/projected/a750395e-b985-4d27-bd7c-fc1fdea00304-kube-api-access-tcnpj\") pod \"network-check-target-m64jw\" (UID: \"a750395e-b985-4d27-bd7c-fc1fdea00304\") " pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:20:35.296508 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:35.296458 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-59zbx\"" Apr 17 16:20:35.304666 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:35.304646 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:20:35.420122 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:35.420096 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m64jw"] Apr 17 16:20:35.423228 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:20:35.423193 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda750395e_b985_4d27_bd7c_fc1fdea00304.slice/crio-f30087fe4ed4bb54b533e5088c5c57f373f123b42ed36b72cf49a7a91ddcbf5a WatchSource:0}: Error finding container f30087fe4ed4bb54b533e5088c5c57f373f123b42ed36b72cf49a7a91ddcbf5a: Status 404 returned error can't find the container with id f30087fe4ed4bb54b533e5088c5c57f373f123b42ed36b72cf49a7a91ddcbf5a Apr 17 16:20:35.491668 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:35.491637 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m64jw" event={"ID":"a750395e-b985-4d27-bd7c-fc1fdea00304","Type":"ContainerStarted","Data":"f30087fe4ed4bb54b533e5088c5c57f373f123b42ed36b72cf49a7a91ddcbf5a"} Apr 17 16:20:38.500786 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:38.500760 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m64jw" event={"ID":"a750395e-b985-4d27-bd7c-fc1fdea00304","Type":"ContainerStarted","Data":"563e4c0ed546e0aa374b3088dcdc88bfa067484beeedc0a61fe02b8e891159bd"} Apr 17 16:20:38.501088 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:38.500886 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:20:38.514855 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:20:38.514789 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-m64jw" podStartSLOduration=65.522385977 podStartE2EDuration="1m8.5147766s" podCreationTimestamp="2026-04-17 16:19:30 +0000 UTC" firstStartedPulling="2026-04-17 16:20:35.425622197 +0000 UTC m=+65.799535117" lastFinishedPulling="2026-04-17 16:20:38.418012817 +0000 UTC m=+68.791925740" observedRunningTime="2026-04-17 16:20:38.514591714 +0000 UTC m=+68.888504656" watchObservedRunningTime="2026-04-17 16:20:38.5147766 +0000 UTC m=+68.888689542" Apr 17 16:21:06.880674 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:21:06.880636 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert\") pod \"ingress-canary-5sm7s\" (UID: \"93428031-c4fe-4f1f-a088-b038195cf17e\") " pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:21:06.881091 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:21:06.880686 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:21:06.881091 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:21:06.880780 2584 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:21:06.881091 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:21:06.880781 2584 secret.go:189] Couldn't get 
secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:21:06.881091 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:21:06.880791 2584 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64df8799cb-6mcdx: secret "image-registry-tls" not found Apr 17 16:21:06.881091 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:21:06.880851 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls podName:02673158-23e7-43fc-8144-c9a194260603 nodeName:}" failed. No retries permitted until 2026-04-17 16:22:10.880837539 +0000 UTC m=+161.254750459 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls") pod "image-registry-64df8799cb-6mcdx" (UID: "02673158-23e7-43fc-8144-c9a194260603") : secret "image-registry-tls" not found Apr 17 16:21:06.881091 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:21:06.880864 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert podName:93428031-c4fe-4f1f-a088-b038195cf17e nodeName:}" failed. No retries permitted until 2026-04-17 16:22:10.880857171 +0000 UTC m=+161.254770091 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert") pod "ingress-canary-5sm7s" (UID: "93428031-c4fe-4f1f-a088-b038195cf17e") : secret "canary-serving-cert" not found Apr 17 16:21:06.981193 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:21:06.981161 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:21:06.981325 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:21:06.981280 2584 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:21:06.981363 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:21:06.981332 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls podName:934467f3-270b-4b90-b4e8-331914b57c8d nodeName:}" failed. No retries permitted until 2026-04-17 16:22:10.981319665 +0000 UTC m=+161.355232585 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls") pod "dns-default-ddhcx" (UID: "934467f3-270b-4b90-b4e8-331914b57c8d") : secret "dns-default-metrics-tls" not found Apr 17 16:21:09.505904 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:21:09.505874 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-m64jw" Apr 17 16:21:36.517556 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:21:36.517526 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-976rk_b2b1e0e2-9eb2-4e4f-9027-c81e854a984c/dns-node-resolver/0.log" Apr 17 16:21:37.117440 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:21:37.117412 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cdxj4_4dd52096-bca5-4442-852e-5f41d1bb9827/node-ca/0.log" Apr 17 16:21:39.023545 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:21:39.023513 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:21:39.023898 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:21:39.023627 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:21:39.023898 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:21:39.023696 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs podName:dcbdb9dc-df9a-4c0b-850e-370061051a08 nodeName:}" failed. No retries permitted until 2026-04-17 16:23:41.023681191 +0000 UTC m=+251.397594112 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs") pod "network-metrics-daemon-cgrms" (UID: "dcbdb9dc-df9a-4c0b-850e-370061051a08") : secret "metrics-daemon-secret" not found Apr 17 16:22:05.997961 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:22:05.997919 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" podUID="02673158-23e7-43fc-8144-c9a194260603" Apr 17 16:22:06.006053 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:22:06.006019 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-5sm7s" podUID="93428031-c4fe-4f1f-a088-b038195cf17e" Apr 17 16:22:06.086007 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.085981 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2dbmm"] Apr 17 16:22:06.087801 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.087787 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.090366 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.090343 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 16:22:06.091408 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.091368 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8zcsn\"" Apr 17 16:22:06.091651 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.091377 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 16:22:06.091651 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.091392 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 16:22:06.091651 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.091461 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 16:22:06.101540 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.101519 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2dbmm"] Apr 17 16:22:06.107759 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:22:06.107733 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-ddhcx" podUID="934467f3-270b-4b90-b4e8-331914b57c8d" Apr 17 16:22:06.215365 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.215338 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh7ft\" (UniqueName: \"kubernetes.io/projected/2f0b1797-1af9-4631-be74-97e2db42f8ec-kube-api-access-bh7ft\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.215365 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.215378 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2f0b1797-1af9-4631-be74-97e2db42f8ec-data-volume\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.215584 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.215416 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2f0b1797-1af9-4631-be74-97e2db42f8ec-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.215584 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.215435 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2f0b1797-1af9-4631-be74-97e2db42f8ec-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.215584 ip-10-0-134-77 
kubenswrapper[2584]: I0417 16:22:06.215514 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2f0b1797-1af9-4631-be74-97e2db42f8ec-crio-socket\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.316407 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.316375 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2f0b1797-1af9-4631-be74-97e2db42f8ec-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.316407 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.316406 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2f0b1797-1af9-4631-be74-97e2db42f8ec-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.316609 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.316435 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2f0b1797-1af9-4631-be74-97e2db42f8ec-crio-socket\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.316609 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.316519 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh7ft\" (UniqueName: \"kubernetes.io/projected/2f0b1797-1af9-4631-be74-97e2db42f8ec-kube-api-access-bh7ft\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.316609 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.316556 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2f0b1797-1af9-4631-be74-97e2db42f8ec-data-volume\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.316609 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.316584 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2f0b1797-1af9-4631-be74-97e2db42f8ec-crio-socket\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.316914 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.316897 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2f0b1797-1af9-4631-be74-97e2db42f8ec-data-volume\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.316968 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.316909 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" 
(UniqueName: \"kubernetes.io/configmap/2f0b1797-1af9-4631-be74-97e2db42f8ec-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.319396 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.319381 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2f0b1797-1af9-4631-be74-97e2db42f8ec-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.324887 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.324867 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh7ft\" (UniqueName: \"kubernetes.io/projected/2f0b1797-1af9-4631-be74-97e2db42f8ec-kube-api-access-bh7ft\") pod \"insights-runtime-extractor-2dbmm\" (UID: \"2f0b1797-1af9-4631-be74-97e2db42f8ec\") " pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.396271 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.396247 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2dbmm" Apr 17 16:22:06.514734 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.514706 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2dbmm"] Apr 17 16:22:06.519041 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:22:06.519002 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f0b1797_1af9_4631_be74_97e2db42f8ec.slice/crio-e7ccf8611f59e10c0202550bad07a002b4eb997811fe0cb78252fac754383e2f WatchSource:0}: Error finding container e7ccf8611f59e10c0202550bad07a002b4eb997811fe0cb78252fac754383e2f: Status 404 returned error can't find the container with id e7ccf8611f59e10c0202550bad07a002b4eb997811fe0cb78252fac754383e2f Apr 17 16:22:06.704802 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.704775 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ddhcx" Apr 17 16:22:06.704802 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.704784 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2dbmm" event={"ID":"2f0b1797-1af9-4631-be74-97e2db42f8ec","Type":"ContainerStarted","Data":"8009a5fad5c6f86e9b8a10934a1198a77f582089e899ee13710f70f6d225ba07"} Apr 17 16:22:06.705018 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.704814 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:22:06.705018 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.704827 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2dbmm" event={"ID":"2f0b1797-1af9-4631-be74-97e2db42f8ec","Type":"ContainerStarted","Data":"e7ccf8611f59e10c0202550bad07a002b4eb997811fe0cb78252fac754383e2f"} Apr 17 16:22:06.705018 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:06.704801 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:22:07.198121 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:22:07.198082 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-cgrms" podUID="dcbdb9dc-df9a-4c0b-850e-370061051a08" Apr 17 16:22:07.708960 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:07.708922 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2dbmm" event={"ID":"2f0b1797-1af9-4631-be74-97e2db42f8ec","Type":"ContainerStarted","Data":"572795790b126fcadbcf4ea1950e83fbc0092c5c6766f6d812b0ec5503486237"} Apr 17 16:22:09.714948 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:09.714870 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2dbmm" event={"ID":"2f0b1797-1af9-4631-be74-97e2db42f8ec","Type":"ContainerStarted","Data":"bca6b74946363b5a569d6e6971a7fca0c68785564ab5c7a3016f7c48b83867db"} Apr 17 16:22:09.731689 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:09.731645 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2dbmm" podStartSLOduration=0.984454126 podStartE2EDuration="3.731632359s" podCreationTimestamp="2026-04-17 16:22:06 +0000 UTC" firstStartedPulling="2026-04-17 16:22:06.565871048 +0000 UTC m=+156.939783968" lastFinishedPulling="2026-04-17 16:22:09.313049281 +0000 UTC m=+159.686962201" observedRunningTime="2026-04-17 16:22:09.731060617 +0000 UTC m=+160.104973559" watchObservedRunningTime="2026-04-17 16:22:09.731632359 +0000 UTC m=+160.105545301" Apr 17 16:22:10.955591 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:10.955549 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:22:10.955972 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:10.955610 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert\") pod \"ingress-canary-5sm7s\" (UID: \"93428031-c4fe-4f1f-a088-b038195cf17e\") " pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:22:10.958067 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:10.958039 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls\") pod \"image-registry-64df8799cb-6mcdx\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:22:10.958177 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:10.958041 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93428031-c4fe-4f1f-a088-b038195cf17e-cert\") pod \"ingress-canary-5sm7s\" (UID: \"93428031-c4fe-4f1f-a088-b038195cf17e\") " pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:22:11.056485 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.056446 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:22:11.058594 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.058575 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/934467f3-270b-4b90-b4e8-331914b57c8d-metrics-tls\") pod \"dns-default-ddhcx\" (UID: \"934467f3-270b-4b90-b4e8-331914b57c8d\") " pod="openshift-dns/dns-default-ddhcx" Apr 17 16:22:11.209358 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.209283 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vgznp\"" Apr 17 16:22:11.209358 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.209305 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5mbwd\"" Apr 17 16:22:11.209680 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.209283 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9rhb9\"" Apr 17 16:22:11.216338 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.216321 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ddhcx" Apr 17 16:22:11.216420 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.216342 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:22:11.216472 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.216322 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5sm7s" Apr 17 16:22:11.349874 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.349847 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5sm7s"] Apr 17 16:22:11.352659 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:22:11.352611 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93428031_c4fe_4f1f_a088_b038195cf17e.slice/crio-c3984dd363f23e5c3373363c25052d064eeaa095b850823d2ce22695bd4e42f3 WatchSource:0}: Error finding container c3984dd363f23e5c3373363c25052d064eeaa095b850823d2ce22695bd4e42f3: Status 404 returned error can't find the container with id c3984dd363f23e5c3373363c25052d064eeaa095b850823d2ce22695bd4e42f3 Apr 17 16:22:11.383030 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.383008 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64df8799cb-6mcdx"] Apr 17 16:22:11.385537 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:22:11.385510 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02673158_23e7_43fc_8144_c9a194260603.slice/crio-6027ba22468d43b4d5e98a0bef667f0682c239733b3ae187bfd86de316fad077 WatchSource:0}: Error finding container 6027ba22468d43b4d5e98a0bef667f0682c239733b3ae187bfd86de316fad077: Status 404 returned error can't find the container with id 6027ba22468d43b4d5e98a0bef667f0682c239733b3ae187bfd86de316fad077 Apr 17 16:22:11.588472 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.588438 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ddhcx"] Apr 17 16:22:11.591247 
ip-10-0-134-77 kubenswrapper[2584]: W0417 16:22:11.591219 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod934467f3_270b_4b90_b4e8_331914b57c8d.slice/crio-64028f87c28c10238fb66021e51391714eaec0880d87376ad372f83e041ed920 WatchSource:0}: Error finding container 64028f87c28c10238fb66021e51391714eaec0880d87376ad372f83e041ed920: Status 404 returned error can't find the container with id 64028f87c28c10238fb66021e51391714eaec0880d87376ad372f83e041ed920 Apr 17 16:22:11.721873 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.721836 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" event={"ID":"02673158-23e7-43fc-8144-c9a194260603","Type":"ContainerStarted","Data":"6f7ecf2b4d35b4f392eddaa535602c8c5a96027c6d09812fe7e0731cdcf8438d"} Apr 17 16:22:11.721873 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.721876 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" event={"ID":"02673158-23e7-43fc-8144-c9a194260603","Type":"ContainerStarted","Data":"6027ba22468d43b4d5e98a0bef667f0682c239733b3ae187bfd86de316fad077"} Apr 17 16:22:11.722094 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.721970 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:22:11.723007 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.722982 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ddhcx" event={"ID":"934467f3-270b-4b90-b4e8-331914b57c8d","Type":"ContainerStarted","Data":"64028f87c28c10238fb66021e51391714eaec0880d87376ad372f83e041ed920"} Apr 17 16:22:11.723903 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.723884 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5sm7s" event={"ID":"93428031-c4fe-4f1f-a088-b038195cf17e","Type":"ContainerStarted","Data":"c3984dd363f23e5c3373363c25052d064eeaa095b850823d2ce22695bd4e42f3"} Apr 17 16:22:11.741745 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:11.741693 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" podStartSLOduration=161.741676287 podStartE2EDuration="2m41.741676287s" podCreationTimestamp="2026-04-17 16:19:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:22:11.74055974 +0000 UTC m=+162.114472684" watchObservedRunningTime="2026-04-17 16:22:11.741676287 +0000 UTC m=+162.115589230" Apr 17 16:22:13.732466 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:13.732425 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ddhcx" event={"ID":"934467f3-270b-4b90-b4e8-331914b57c8d","Type":"ContainerStarted","Data":"964803a3361cd279aa737b9a55bf9d7ee99a6fa7c697ac4ed93a1f2f41c60cf9"} Apr 17 16:22:13.732466 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:13.732468 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ddhcx" event={"ID":"934467f3-270b-4b90-b4e8-331914b57c8d","Type":"ContainerStarted","Data":"93cb46798ef6d0429007df01bf2afbdb4f3601320fed6995b5b0c9e74baa5499"} Apr 17 16:22:13.732991 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:13.732550 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-dns/dns-default-ddhcx" Apr 17 16:22:13.734171 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:13.734144 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5sm7s" event={"ID":"93428031-c4fe-4f1f-a088-b038195cf17e","Type":"ContainerStarted","Data":"9b3c50e9ddd8c15db7231af3be2ed48cf6101a3233029f6a307af7d54baaddc4"} Apr 17 16:22:13.748531 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:13.748465 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ddhcx" podStartSLOduration=128.862431962 podStartE2EDuration="2m10.748451254s" podCreationTimestamp="2026-04-17 16:20:03 +0000 UTC" firstStartedPulling="2026-04-17 16:22:11.593135011 +0000 UTC m=+161.967047931" lastFinishedPulling="2026-04-17 16:22:13.479154303 +0000 UTC m=+163.853067223" observedRunningTime="2026-04-17 16:22:13.748016165 +0000 UTC m=+164.121929109" watchObservedRunningTime="2026-04-17 16:22:13.748451254 +0000 UTC m=+164.122364195" Apr 17 16:22:13.763827 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:13.763789 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5sm7s" podStartSLOduration=129.678618408 podStartE2EDuration="2m11.763777229s" podCreationTimestamp="2026-04-17 16:20:02 +0000 UTC" firstStartedPulling="2026-04-17 16:22:11.354491833 +0000 UTC m=+161.728404754" lastFinishedPulling="2026-04-17 16:22:13.439650655 +0000 UTC m=+163.813563575" observedRunningTime="2026-04-17 16:22:13.763303104 +0000 UTC m=+164.137216045" watchObservedRunningTime="2026-04-17 16:22:13.763777229 +0000 UTC m=+164.137690171" Apr 17 16:22:14.929868 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.929836 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-n489l"] Apr 17 16:22:14.932977 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.932956 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:14.935592 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.935569 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 16:22:14.935739 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.935569 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 16:22:14.935739 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.935621 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 16:22:14.935739 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.935723 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 16:22:14.935901 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.935842 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 16:22:14.936789 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.936773 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-z7htb\"" Apr 17 16:22:14.937672 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.937656 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 16:22:14.991197 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.991168 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-accelerators-collector-config\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:14.991325 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.991208 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-textfile\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:14.991325 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.991241 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4147357-1eb5-4032-b42c-5cc65a071498-metrics-client-ca\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:14.991325 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.991292 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-wtmp\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:14.991477 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.991326 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/d4147357-1eb5-4032-b42c-5cc65a071498-sys\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:14.991477 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.991355 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:14.991477 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.991393 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-tls\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:14.991477 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.991430 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zj6d\" (UniqueName: \"kubernetes.io/projected/d4147357-1eb5-4032-b42c-5cc65a071498-kube-api-access-7zj6d\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:14.991477 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:14.991455 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d4147357-1eb5-4032-b42c-5cc65a071498-root\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.092118 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092074 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4147357-1eb5-4032-b42c-5cc65a071498-metrics-client-ca\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.092250 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092141 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-wtmp\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.092250 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092176 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4147357-1eb5-4032-b42c-5cc65a071498-sys\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.092250 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092204 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" 
Apr 17 16:22:15.092250 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092234 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-tls\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.092442 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092260 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zj6d\" (UniqueName: \"kubernetes.io/projected/d4147357-1eb5-4032-b42c-5cc65a071498-kube-api-access-7zj6d\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.092442 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092281 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d4147357-1eb5-4032-b42c-5cc65a071498-root\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.092442 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092300 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4147357-1eb5-4032-b42c-5cc65a071498-sys\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.092442 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092334 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-accelerators-collector-config\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.092442 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092334 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-wtmp\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.092442 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092362 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d4147357-1eb5-4032-b42c-5cc65a071498-root\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.092442 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:22:15.092370 2584 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 16:22:15.092442 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092365 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-textfile\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.092442 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:22:15.092433 2584 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-tls podName:d4147357-1eb5-4032-b42c-5cc65a071498 nodeName:}" failed. No retries permitted until 2026-04-17 16:22:15.59241451 +0000 UTC m=+165.966327431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-tls") pod "node-exporter-n489l" (UID: "d4147357-1eb5-4032-b42c-5cc65a071498") : secret "node-exporter-tls" not found Apr 17 16:22:15.092847 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092661 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-textfile\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.092897 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092874 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-accelerators-collector-config\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.092933 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.092904 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4147357-1eb5-4032-b42c-5cc65a071498-metrics-client-ca\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.094979 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.094952 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.100932 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.100900 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zj6d\" (UniqueName: \"kubernetes.io/projected/d4147357-1eb5-4032-b42c-5cc65a071498-kube-api-access-7zj6d\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.597467 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.597432 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-tls\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.599833 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.599808 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d4147357-1eb5-4032-b42c-5cc65a071498-node-exporter-tls\") pod \"node-exporter-n489l\" (UID: \"d4147357-1eb5-4032-b42c-5cc65a071498\") " pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.841845 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:15.841811 2584 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-monitoring/node-exporter-n489l" Apr 17 16:22:15.849309 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:22:15.849254 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4147357_1eb5_4032_b42c_5cc65a071498.slice/crio-a0fbfe045a23932a836f49e8f21f25ade9f4763163a37ccaa6bb8ce604f9989b WatchSource:0}: Error finding container a0fbfe045a23932a836f49e8f21f25ade9f4763163a37ccaa6bb8ce604f9989b: Status 404 returned error can't find the container with id a0fbfe045a23932a836f49e8f21f25ade9f4763163a37ccaa6bb8ce604f9989b Apr 17 16:22:16.742277 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:16.742242 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n489l" event={"ID":"d4147357-1eb5-4032-b42c-5cc65a071498","Type":"ContainerStarted","Data":"992d9dbaa7988af90e85e6700abaf81d887cfda97218e1b972beda0d1c7ad44e"} Apr 17 16:22:16.742643 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:16.742285 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n489l" event={"ID":"d4147357-1eb5-4032-b42c-5cc65a071498","Type":"ContainerStarted","Data":"a0fbfe045a23932a836f49e8f21f25ade9f4763163a37ccaa6bb8ce604f9989b"} Apr 17 16:22:17.426854 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:17.426800 2584 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" podUID="ebac6b61-ba93-4d25-b409-4b535683b067" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.10:8000/readyz\": dial tcp 10.132.0.10:8000: connect: connection refused" Apr 17 16:22:17.746443 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:17.746361 2584 generic.go:358] "Generic (PLEG): container finished" podID="3406f4a9-353e-4f09-9d82-56ea350d37e8" containerID="b258051f313b75993f6dcc1ddd95b0dd4ac500a876b1cd2aaacfa6f0fed1cab0" exitCode=255 Apr 17 16:22:17.746859 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:17.746436 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9" event={"ID":"3406f4a9-353e-4f09-9d82-56ea350d37e8","Type":"ContainerDied","Data":"b258051f313b75993f6dcc1ddd95b0dd4ac500a876b1cd2aaacfa6f0fed1cab0"} Apr 17 16:22:17.746859 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:17.746818 2584 scope.go:117] "RemoveContainer" containerID="b258051f313b75993f6dcc1ddd95b0dd4ac500a876b1cd2aaacfa6f0fed1cab0" Apr 17 16:22:17.747821 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:17.747803 2584 generic.go:358] "Generic (PLEG): container finished" podID="d4147357-1eb5-4032-b42c-5cc65a071498" containerID="992d9dbaa7988af90e85e6700abaf81d887cfda97218e1b972beda0d1c7ad44e" exitCode=0 Apr 17 16:22:17.747905 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:17.747876 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n489l" event={"ID":"d4147357-1eb5-4032-b42c-5cc65a071498","Type":"ContainerDied","Data":"992d9dbaa7988af90e85e6700abaf81d887cfda97218e1b972beda0d1c7ad44e"} Apr 17 16:22:17.749345 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:17.749309 2584 generic.go:358] "Generic (PLEG): container finished" podID="ebac6b61-ba93-4d25-b409-4b535683b067" containerID="642775d389cf716393e2d76acae4f49b3e06f1ff7779139a6e0be35772f8e1e0" exitCode=1 Apr 17 16:22:17.749432 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:17.749359 2584 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" event={"ID":"ebac6b61-ba93-4d25-b409-4b535683b067","Type":"ContainerDied","Data":"642775d389cf716393e2d76acae4f49b3e06f1ff7779139a6e0be35772f8e1e0"} Apr 17 16:22:17.749696 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:17.749681 2584 scope.go:117] "RemoveContainer" containerID="642775d389cf716393e2d76acae4f49b3e06f1ff7779139a6e0be35772f8e1e0" Apr 17 16:22:18.184767 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:18.184737 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:22:18.753217 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:18.753177 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5694b46cd9-dpcf9" event={"ID":"3406f4a9-353e-4f09-9d82-56ea350d37e8","Type":"ContainerStarted","Data":"0e5220e868fe9a3bfc02ccb4c9800161ba3ee7a88855eea1498535c547b02f7c"} Apr 17 16:22:18.754863 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:18.754838 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n489l" event={"ID":"d4147357-1eb5-4032-b42c-5cc65a071498","Type":"ContainerStarted","Data":"1a9d9132bdcdc328dca3d7b92750c435bfb5655f0161383957af370201d20a0e"} Apr 17 16:22:18.754992 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:18.754868 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n489l" event={"ID":"d4147357-1eb5-4032-b42c-5cc65a071498","Type":"ContainerStarted","Data":"ef6372250493b2619168181c0be48085b5e9c3b9d75ed57da64d5632f82818ce"} Apr 17 16:22:18.756189 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:18.756173 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" event={"ID":"ebac6b61-ba93-4d25-b409-4b535683b067","Type":"ContainerStarted","Data":"7032c211694f2c90bd841d4b79e44c797e66e64c1e1ea51c0d3acd637eb67b6e"} Apr 17 16:22:18.756413 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:18.756400 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 16:22:18.756966 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:18.756949 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dc4f7cb58-54dh8" Apr 17 16:22:18.792984 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:18.792940 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-n489l" podStartSLOduration=4.030444254 podStartE2EDuration="4.792929195s" podCreationTimestamp="2026-04-17 16:22:14 +0000 UTC" firstStartedPulling="2026-04-17 16:22:15.850948961 +0000 UTC m=+166.224861884" lastFinishedPulling="2026-04-17 16:22:16.613433905 +0000 UTC m=+166.987346825" observedRunningTime="2026-04-17 16:22:18.791863683 +0000 UTC m=+169.165776648" watchObservedRunningTime="2026-04-17 16:22:18.792929195 +0000 UTC m=+169.166842183" Apr 17 16:22:23.738621 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:23.738593 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ddhcx" Apr 17 16:22:28.625335 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:28.625294 2584 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["openshift-image-registry/image-registry-64df8799cb-6mcdx"] Apr 17 16:22:28.629695 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:28.629672 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:22:39.046800 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:39.046763 2584 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" podUID="340231c1-3824-44e5-baf9-ae36a08a198b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 16:22:49.046550 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:49.046491 2584 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" podUID="340231c1-3824-44e5-baf9-ae36a08a198b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 16:22:53.644348 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.644289 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" podUID="02673158-23e7-43fc-8144-c9a194260603" containerName="registry" containerID="cri-o://6f7ecf2b4d35b4f392eddaa535602c8c5a96027c6d09812fe7e0731cdcf8438d" gracePeriod=30 Apr 17 16:22:53.847971 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.847923 2584 generic.go:358] "Generic (PLEG): container finished" podID="02673158-23e7-43fc-8144-c9a194260603" containerID="6f7ecf2b4d35b4f392eddaa535602c8c5a96027c6d09812fe7e0731cdcf8438d" exitCode=0 Apr 17 16:22:53.848130 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.848023 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" event={"ID":"02673158-23e7-43fc-8144-c9a194260603","Type":"ContainerDied","Data":"6f7ecf2b4d35b4f392eddaa535602c8c5a96027c6d09812fe7e0731cdcf8438d"} Apr 17 16:22:53.887378 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.887356 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:22:53.969228 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.969162 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-bound-sa-token\") pod \"02673158-23e7-43fc-8144-c9a194260603\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " Apr 17 16:22:53.969228 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.969200 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02673158-23e7-43fc-8144-c9a194260603-registry-certificates\") pod \"02673158-23e7-43fc-8144-c9a194260603\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " Apr 17 16:22:53.969228 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.969221 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02673158-23e7-43fc-8144-c9a194260603-trusted-ca\") pod \"02673158-23e7-43fc-8144-c9a194260603\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " Apr 17 16:22:53.969476 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.969237 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnh4l\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-kube-api-access-rnh4l\") pod \"02673158-23e7-43fc-8144-c9a194260603\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " Apr 17 16:22:53.969476 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.969330 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02673158-23e7-43fc-8144-c9a194260603-installation-pull-secrets\") pod \"02673158-23e7-43fc-8144-c9a194260603\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " Apr 17 16:22:53.969476 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.969412 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls\") pod \"02673158-23e7-43fc-8144-c9a194260603\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " Apr 17 16:22:53.969476 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.969440 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02673158-23e7-43fc-8144-c9a194260603-ca-trust-extracted\") pod \"02673158-23e7-43fc-8144-c9a194260603\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " Apr 17 16:22:53.969476 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.969466 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02673158-23e7-43fc-8144-c9a194260603-image-registry-private-configuration\") pod \"02673158-23e7-43fc-8144-c9a194260603\" (UID: \"02673158-23e7-43fc-8144-c9a194260603\") " Apr 17 16:22:53.969763 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.969735 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02673158-23e7-43fc-8144-c9a194260603-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "02673158-23e7-43fc-8144-c9a194260603" (UID: "02673158-23e7-43fc-8144-c9a194260603"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:22:53.970149 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.970122 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02673158-23e7-43fc-8144-c9a194260603-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "02673158-23e7-43fc-8144-c9a194260603" (UID: "02673158-23e7-43fc-8144-c9a194260603"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:22:53.972027 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.971998 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-kube-api-access-rnh4l" (OuterVolumeSpecName: "kube-api-access-rnh4l") pod "02673158-23e7-43fc-8144-c9a194260603" (UID: "02673158-23e7-43fc-8144-c9a194260603"). InnerVolumeSpecName "kube-api-access-rnh4l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:22:53.972175 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.972140 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "02673158-23e7-43fc-8144-c9a194260603" (UID: "02673158-23e7-43fc-8144-c9a194260603"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:22:53.972340 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.972311 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02673158-23e7-43fc-8144-c9a194260603-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "02673158-23e7-43fc-8144-c9a194260603" (UID: "02673158-23e7-43fc-8144-c9a194260603"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:22:53.972388 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.972328 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "02673158-23e7-43fc-8144-c9a194260603" (UID: "02673158-23e7-43fc-8144-c9a194260603"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:22:53.972388 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.972372 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02673158-23e7-43fc-8144-c9a194260603-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "02673158-23e7-43fc-8144-c9a194260603" (UID: "02673158-23e7-43fc-8144-c9a194260603"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:22:53.977563 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:53.977540 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02673158-23e7-43fc-8144-c9a194260603-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "02673158-23e7-43fc-8144-c9a194260603" (UID: "02673158-23e7-43fc-8144-c9a194260603"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:22:54.070643 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:54.070619 2584 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-registry-tls\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 17 16:22:54.070643 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:54.070640 2584 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02673158-23e7-43fc-8144-c9a194260603-ca-trust-extracted\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 17 16:22:54.070787 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:54.070651 2584 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02673158-23e7-43fc-8144-c9a194260603-image-registry-private-configuration\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 17 16:22:54.070787 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:54.070661 2584 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-bound-sa-token\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 17 16:22:54.070787 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:54.070669 2584 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02673158-23e7-43fc-8144-c9a194260603-registry-certificates\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 17 16:22:54.070787 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:54.070678 2584 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02673158-23e7-43fc-8144-c9a194260603-trusted-ca\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 17 16:22:54.070787 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:54.070687 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rnh4l\" (UniqueName: \"kubernetes.io/projected/02673158-23e7-43fc-8144-c9a194260603-kube-api-access-rnh4l\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 17 16:22:54.070787 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:54.070696 2584 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02673158-23e7-43fc-8144-c9a194260603-installation-pull-secrets\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 17 16:22:54.852341 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:54.852308 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" event={"ID":"02673158-23e7-43fc-8144-c9a194260603","Type":"ContainerDied","Data":"6027ba22468d43b4d5e98a0bef667f0682c239733b3ae187bfd86de316fad077"} Apr 17 16:22:54.852341 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:54.852325 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-64df8799cb-6mcdx" Apr 17 16:22:54.852865 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:54.852359 2584 scope.go:117] "RemoveContainer" containerID="6f7ecf2b4d35b4f392eddaa535602c8c5a96027c6d09812fe7e0731cdcf8438d" Apr 17 16:22:54.871754 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:54.871728 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-64df8799cb-6mcdx"] Apr 17 16:22:54.876675 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:54.876656 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-64df8799cb-6mcdx"] Apr 17 16:22:56.188440 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:56.188402 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02673158-23e7-43fc-8144-c9a194260603" path="/var/lib/kubelet/pods/02673158-23e7-43fc-8144-c9a194260603/volumes" Apr 17 16:22:59.046433 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:59.046398 2584 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" podUID="340231c1-3824-44e5-baf9-ae36a08a198b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 16:22:59.046816 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:59.046474 2584 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" Apr 17 16:22:59.046935 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:59.046904 2584 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"5a6285fed33601e33db5ced03a815d8bb96a28879b2d0cd72a69cc924d8189e0"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 16:22:59.046971 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:59.046954 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" podUID="340231c1-3824-44e5-baf9-ae36a08a198b" containerName="service-proxy" containerID="cri-o://5a6285fed33601e33db5ced03a815d8bb96a28879b2d0cd72a69cc924d8189e0" gracePeriod=30 Apr 17 16:22:59.408569 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:59.408474 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5sm7s_93428031-c4fe-4f1f-a088-b038195cf17e/serve-healthcheck-canary/0.log" Apr 17 16:22:59.867199 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:59.867167 2584 generic.go:358] "Generic (PLEG): container finished" podID="340231c1-3824-44e5-baf9-ae36a08a198b" containerID="5a6285fed33601e33db5ced03a815d8bb96a28879b2d0cd72a69cc924d8189e0" exitCode=2 Apr 17 16:22:59.867359 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:59.867205 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" event={"ID":"340231c1-3824-44e5-baf9-ae36a08a198b","Type":"ContainerDied","Data":"5a6285fed33601e33db5ced03a815d8bb96a28879b2d0cd72a69cc924d8189e0"} Apr 17 16:22:59.867359 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:22:59.867226 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bcc89b8f-zclrn" 
event={"ID":"340231c1-3824-44e5-baf9-ae36a08a198b","Type":"ContainerStarted","Data":"497b298e69f06db36b8eefa257d0738ccbcc4a7c231bfb5c761152831be77101"} Apr 17 16:23:41.092567 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:23:41.092523 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:23:41.094869 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:23:41.094851 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcbdb9dc-df9a-4c0b-850e-370061051a08-metrics-certs\") pod \"network-metrics-daemon-cgrms\" (UID: \"dcbdb9dc-df9a-4c0b-850e-370061051a08\") " pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:23:41.287667 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:23:41.287635 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qr2mj\"" Apr 17 16:23:41.295635 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:23:41.295610 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgrms" Apr 17 16:23:41.423536 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:23:41.423447 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cgrms"] Apr 17 16:23:41.426063 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:23:41.426024 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcbdb9dc_df9a_4c0b_850e_370061051a08.slice/crio-531e9dca02c3d957bc6e3f51d2fcde033cd2a7f48b854d95512789a7964a81cd WatchSource:0}: Error finding container 531e9dca02c3d957bc6e3f51d2fcde033cd2a7f48b854d95512789a7964a81cd: Status 404 returned error can't find the container with id 531e9dca02c3d957bc6e3f51d2fcde033cd2a7f48b854d95512789a7964a81cd Apr 17 16:23:41.968654 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:23:41.968616 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cgrms" event={"ID":"dcbdb9dc-df9a-4c0b-850e-370061051a08","Type":"ContainerStarted","Data":"531e9dca02c3d957bc6e3f51d2fcde033cd2a7f48b854d95512789a7964a81cd"} Apr 17 16:23:42.972277 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:23:42.972238 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cgrms" event={"ID":"dcbdb9dc-df9a-4c0b-850e-370061051a08","Type":"ContainerStarted","Data":"af7c9ca5fcf309837810edc2e8182fcf89782570a46744dcaf7c02a02b06b180"} Apr 17 16:23:42.972660 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:23:42.972281 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cgrms" event={"ID":"dcbdb9dc-df9a-4c0b-850e-370061051a08","Type":"ContainerStarted","Data":"f306f54d50bea36b91da33ce38452de84c55106c9eff50b019aea0d20a174384"} Apr 17 16:23:42.990160 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:23:42.990110 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cgrms" podStartSLOduration=252.0513938 podStartE2EDuration="4m12.99009758s" podCreationTimestamp="2026-04-17 16:19:30 +0000 UTC" firstStartedPulling="2026-04-17 16:23:41.427633963 +0000 UTC 
m=+251.801546884" lastFinishedPulling="2026-04-17 16:23:42.366337743 +0000 UTC m=+252.740250664" observedRunningTime="2026-04-17 16:23:42.988699584 +0000 UTC m=+253.362612528" watchObservedRunningTime="2026-04-17 16:23:42.99009758 +0000 UTC m=+253.364010522" Apr 17 16:24:30.061929 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:24:30.061903 2584 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 16:25:34.340614 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.340581 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-w5fn9"] Apr 17 16:25:34.341029 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.340791 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02673158-23e7-43fc-8144-c9a194260603" containerName="registry" Apr 17 16:25:34.341029 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.340802 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="02673158-23e7-43fc-8144-c9a194260603" containerName="registry" Apr 17 16:25:34.341029 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.340841 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="02673158-23e7-43fc-8144-c9a194260603" containerName="registry" Apr 17 16:25:34.343400 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.343380 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-w5fn9" Apr 17 16:25:34.346025 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.345981 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 16:25:34.346125 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.346030 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-7k26z\"" Apr 17 16:25:34.347237 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.347223 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 16:25:34.350591 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.350559 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-w5fn9"] Apr 17 16:25:34.457535 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.457513 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6b0e4d4-7992-492a-8d7f-01c4fba79c1f-bound-sa-token\") pod \"cert-manager-759f64656b-w5fn9\" (UID: \"a6b0e4d4-7992-492a-8d7f-01c4fba79c1f\") " pod="cert-manager/cert-manager-759f64656b-w5fn9" Apr 17 16:25:34.457655 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.457545 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc2q2\" (UniqueName: \"kubernetes.io/projected/a6b0e4d4-7992-492a-8d7f-01c4fba79c1f-kube-api-access-bc2q2\") pod \"cert-manager-759f64656b-w5fn9\" (UID: \"a6b0e4d4-7992-492a-8d7f-01c4fba79c1f\") " pod="cert-manager/cert-manager-759f64656b-w5fn9" Apr 17 16:25:34.558613 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.558577 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6b0e4d4-7992-492a-8d7f-01c4fba79c1f-bound-sa-token\") pod \"cert-manager-759f64656b-w5fn9\" (UID: \"a6b0e4d4-7992-492a-8d7f-01c4fba79c1f\") " pod="cert-manager/cert-manager-759f64656b-w5fn9" Apr 17 16:25:34.558613 
ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.558615 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bc2q2\" (UniqueName: \"kubernetes.io/projected/a6b0e4d4-7992-492a-8d7f-01c4fba79c1f-kube-api-access-bc2q2\") pod \"cert-manager-759f64656b-w5fn9\" (UID: \"a6b0e4d4-7992-492a-8d7f-01c4fba79c1f\") " pod="cert-manager/cert-manager-759f64656b-w5fn9" Apr 17 16:25:34.566224 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.566196 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6b0e4d4-7992-492a-8d7f-01c4fba79c1f-bound-sa-token\") pod \"cert-manager-759f64656b-w5fn9\" (UID: \"a6b0e4d4-7992-492a-8d7f-01c4fba79c1f\") " pod="cert-manager/cert-manager-759f64656b-w5fn9" Apr 17 16:25:34.566333 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.566256 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc2q2\" (UniqueName: \"kubernetes.io/projected/a6b0e4d4-7992-492a-8d7f-01c4fba79c1f-kube-api-access-bc2q2\") pod \"cert-manager-759f64656b-w5fn9\" (UID: \"a6b0e4d4-7992-492a-8d7f-01c4fba79c1f\") " pod="cert-manager/cert-manager-759f64656b-w5fn9" Apr 17 16:25:34.652243 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.652177 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-w5fn9" Apr 17 16:25:34.766355 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.766327 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-w5fn9"] Apr 17 16:25:34.769197 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:25:34.769168 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6b0e4d4_7992_492a_8d7f_01c4fba79c1f.slice/crio-de2c1af1176b468a7b475ae86e944a2d7f18607085eeaf51f97bd7d0d917d288 WatchSource:0}: Error finding container de2c1af1176b468a7b475ae86e944a2d7f18607085eeaf51f97bd7d0d917d288: Status 404 returned error can't find the container with id de2c1af1176b468a7b475ae86e944a2d7f18607085eeaf51f97bd7d0d917d288 Apr 17 16:25:34.771076 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:34.771059 2584 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:25:35.251352 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:35.251316 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-w5fn9" event={"ID":"a6b0e4d4-7992-492a-8d7f-01c4fba79c1f","Type":"ContainerStarted","Data":"de2c1af1176b468a7b475ae86e944a2d7f18607085eeaf51f97bd7d0d917d288"} Apr 17 16:25:38.261600 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:38.261557 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-w5fn9" event={"ID":"a6b0e4d4-7992-492a-8d7f-01c4fba79c1f","Type":"ContainerStarted","Data":"88ce9657f0763ea61b425bc6c855074133e212abd391123662cac4c734b21741"} Apr 17 16:25:38.276138 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:25:38.276081 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-w5fn9" podStartSLOduration=1.35287703 podStartE2EDuration="4.276066834s" podCreationTimestamp="2026-04-17 16:25:34 +0000 UTC" firstStartedPulling="2026-04-17 16:25:34.77119528 +0000 UTC m=+365.145108200" lastFinishedPulling="2026-04-17 16:25:37.694385084 +0000 UTC m=+368.068298004" observedRunningTime="2026-04-17 
16:25:38.27586923 +0000 UTC m=+368.649782173" watchObservedRunningTime="2026-04-17 16:25:38.276066834 +0000 UTC m=+368.649979778" Apr 17 16:26:11.663840 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.663801 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt"] Apr 17 16:26:11.670012 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.669993 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" Apr 17 16:26:11.672998 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.672975 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 16:26:11.673530 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.673483 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 16:26:11.673753 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.673732 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 16:26:11.673838 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.673782 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 16:26:11.674187 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.674173 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jlzxq\"" Apr 17 16:26:11.688974 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.688950 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt"] Apr 17 16:26:11.806820 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.806784 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjhrs\" (UniqueName: \"kubernetes.io/projected/3a6a74e3-cc67-487a-9daa-af05e0bf63bd-kube-api-access-rjhrs\") pod \"opendatahub-operator-controller-manager-54994d49cf-9dnxt\" (UID: \"3a6a74e3-cc67-487a-9daa-af05e0bf63bd\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" Apr 17 16:26:11.806977 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.806828 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a6a74e3-cc67-487a-9daa-af05e0bf63bd-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-9dnxt\" (UID: \"3a6a74e3-cc67-487a-9daa-af05e0bf63bd\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" Apr 17 16:26:11.806977 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.806851 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a6a74e3-cc67-487a-9daa-af05e0bf63bd-webhook-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-9dnxt\" (UID: \"3a6a74e3-cc67-487a-9daa-af05e0bf63bd\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" Apr 17 16:26:11.908009 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.907980 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjhrs\" (UniqueName: 
\"kubernetes.io/projected/3a6a74e3-cc67-487a-9daa-af05e0bf63bd-kube-api-access-rjhrs\") pod \"opendatahub-operator-controller-manager-54994d49cf-9dnxt\" (UID: \"3a6a74e3-cc67-487a-9daa-af05e0bf63bd\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" Apr 17 16:26:11.908083 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.908021 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a6a74e3-cc67-487a-9daa-af05e0bf63bd-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-9dnxt\" (UID: \"3a6a74e3-cc67-487a-9daa-af05e0bf63bd\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" Apr 17 16:26:11.908083 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.908043 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a6a74e3-cc67-487a-9daa-af05e0bf63bd-webhook-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-9dnxt\" (UID: \"3a6a74e3-cc67-487a-9daa-af05e0bf63bd\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" Apr 17 16:26:11.910622 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.910594 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a6a74e3-cc67-487a-9daa-af05e0bf63bd-webhook-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-9dnxt\" (UID: \"3a6a74e3-cc67-487a-9daa-af05e0bf63bd\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" Apr 17 16:26:11.910622 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.910619 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a6a74e3-cc67-487a-9daa-af05e0bf63bd-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-9dnxt\" (UID: \"3a6a74e3-cc67-487a-9daa-af05e0bf63bd\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" Apr 17 16:26:11.917445 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.917397 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjhrs\" (UniqueName: \"kubernetes.io/projected/3a6a74e3-cc67-487a-9daa-af05e0bf63bd-kube-api-access-rjhrs\") pod \"opendatahub-operator-controller-manager-54994d49cf-9dnxt\" (UID: \"3a6a74e3-cc67-487a-9daa-af05e0bf63bd\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" Apr 17 16:26:11.980181 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:11.980156 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" Apr 17 16:26:12.100988 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:12.100955 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt"] Apr 17 16:26:12.104414 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:26:12.104383 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a6a74e3_cc67_487a_9daa_af05e0bf63bd.slice/crio-94a0c1da89148e9438f988d19a43362bb6c557f60806ec7c5b87611131941d3e WatchSource:0}: Error finding container 94a0c1da89148e9438f988d19a43362bb6c557f60806ec7c5b87611131941d3e: Status 404 returned error can't find the container with id 94a0c1da89148e9438f988d19a43362bb6c557f60806ec7c5b87611131941d3e Apr 17 16:26:12.343770 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:12.343739 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" event={"ID":"3a6a74e3-cc67-487a-9daa-af05e0bf63bd","Type":"ContainerStarted","Data":"94a0c1da89148e9438f988d19a43362bb6c557f60806ec7c5b87611131941d3e"} Apr 17 16:26:15.352348 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:15.352312 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" event={"ID":"3a6a74e3-cc67-487a-9daa-af05e0bf63bd","Type":"ContainerStarted","Data":"71a998bee24f4b1d8f7c7f4493c666c6477be430db4c113c4bfe9a34e1be856f"} Apr 17 16:26:15.352833 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:15.352430 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" Apr 17 16:26:15.378251 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:15.378194 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" podStartSLOduration=1.854319856 podStartE2EDuration="4.378175037s" podCreationTimestamp="2026-04-17 16:26:11 +0000 UTC" firstStartedPulling="2026-04-17 16:26:12.106295453 +0000 UTC m=+402.480208373" lastFinishedPulling="2026-04-17 16:26:14.630150629 +0000 UTC m=+405.004063554" observedRunningTime="2026-04-17 16:26:15.374888269 +0000 UTC m=+405.748801214" watchObservedRunningTime="2026-04-17 16:26:15.378175037 +0000 UTC m=+405.752087979" Apr 17 16:26:26.357420 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:26.357390 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-9dnxt" Apr 17 16:26:29.601660 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.601626 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws"] Apr 17 16:26:29.604690 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.604670 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws" Apr 17 16:26:29.607281 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.607255 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 16:26:29.608405 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.608385 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 16:26:29.608541 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.608390 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 16:26:29.608541 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.608478 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 16:26:29.608541 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.608390 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-c5cck\"" Apr 17 16:26:29.614251 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.614218 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws"] Apr 17 16:26:29.737300 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.737266 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/93601d38-a19b-45fc-afc9-2d7991a1db51-tls-certs\") pod \"kube-auth-proxy-667bf5bb7-p2bws\" (UID: \"93601d38-a19b-45fc-afc9-2d7991a1db51\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws" Apr 17 16:26:29.737480 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.737302 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f75pd\" (UniqueName: \"kubernetes.io/projected/93601d38-a19b-45fc-afc9-2d7991a1db51-kube-api-access-f75pd\") pod \"kube-auth-proxy-667bf5bb7-p2bws\" (UID: \"93601d38-a19b-45fc-afc9-2d7991a1db51\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws" Apr 17 16:26:29.737480 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.737381 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93601d38-a19b-45fc-afc9-2d7991a1db51-tmp\") pod \"kube-auth-proxy-667bf5bb7-p2bws\" (UID: \"93601d38-a19b-45fc-afc9-2d7991a1db51\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws" Apr 17 16:26:29.838278 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.838251 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/93601d38-a19b-45fc-afc9-2d7991a1db51-tls-certs\") pod \"kube-auth-proxy-667bf5bb7-p2bws\" (UID: \"93601d38-a19b-45fc-afc9-2d7991a1db51\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws" Apr 17 16:26:29.838409 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.838285 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f75pd\" (UniqueName: \"kubernetes.io/projected/93601d38-a19b-45fc-afc9-2d7991a1db51-kube-api-access-f75pd\") pod \"kube-auth-proxy-667bf5bb7-p2bws\" (UID: \"93601d38-a19b-45fc-afc9-2d7991a1db51\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws" Apr 17 16:26:29.838409 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.838318 2584 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93601d38-a19b-45fc-afc9-2d7991a1db51-tmp\") pod \"kube-auth-proxy-667bf5bb7-p2bws\" (UID: \"93601d38-a19b-45fc-afc9-2d7991a1db51\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws" Apr 17 16:26:29.840518 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.840475 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93601d38-a19b-45fc-afc9-2d7991a1db51-tmp\") pod \"kube-auth-proxy-667bf5bb7-p2bws\" (UID: \"93601d38-a19b-45fc-afc9-2d7991a1db51\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws" Apr 17 16:26:29.840669 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.840650 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/93601d38-a19b-45fc-afc9-2d7991a1db51-tls-certs\") pod \"kube-auth-proxy-667bf5bb7-p2bws\" (UID: \"93601d38-a19b-45fc-afc9-2d7991a1db51\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws" Apr 17 16:26:29.847516 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.847473 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f75pd\" (UniqueName: \"kubernetes.io/projected/93601d38-a19b-45fc-afc9-2d7991a1db51-kube-api-access-f75pd\") pod \"kube-auth-proxy-667bf5bb7-p2bws\" (UID: \"93601d38-a19b-45fc-afc9-2d7991a1db51\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws" Apr 17 16:26:29.914594 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:29.914523 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws" Apr 17 16:26:30.033546 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:30.033523 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws"] Apr 17 16:26:30.035763 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:26:30.035740 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93601d38_a19b_45fc_afc9_2d7991a1db51.slice/crio-37aed7db0634c3e63fcaf01a4bcaf7231ddf38e1ec675f95a22e87e96cbc796b WatchSource:0}: Error finding container 37aed7db0634c3e63fcaf01a4bcaf7231ddf38e1ec675f95a22e87e96cbc796b: Status 404 returned error can't find the container with id 37aed7db0634c3e63fcaf01a4bcaf7231ddf38e1ec675f95a22e87e96cbc796b Apr 17 16:26:30.392914 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:30.392876 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws" event={"ID":"93601d38-a19b-45fc-afc9-2d7991a1db51","Type":"ContainerStarted","Data":"37aed7db0634c3e63fcaf01a4bcaf7231ddf38e1ec675f95a22e87e96cbc796b"} Apr 17 16:26:32.534344 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:32.534308 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-6wdhg"] Apr 17 16:26:32.537322 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:32.537305 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" Apr 17 16:26:32.539622 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:32.539604 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-hxjtg\"" Apr 17 16:26:32.539708 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:32.539647 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 17 16:26:32.545123 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:32.545104 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-6wdhg"] Apr 17 16:26:32.659483 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:32.659448 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvwr5\" (UniqueName: \"kubernetes.io/projected/32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f-kube-api-access-tvwr5\") pod \"odh-model-controller-858dbf95b8-6wdhg\" (UID: \"32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" Apr 17 16:26:32.659675 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:32.659542 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f-cert\") pod \"odh-model-controller-858dbf95b8-6wdhg\" (UID: \"32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" Apr 17 16:26:32.760149 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:32.760107 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f-cert\") pod \"odh-model-controller-858dbf95b8-6wdhg\" (UID: \"32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" Apr 17 16:26:32.760311 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:32.760182 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvwr5\" (UniqueName: \"kubernetes.io/projected/32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f-kube-api-access-tvwr5\") pod \"odh-model-controller-858dbf95b8-6wdhg\" (UID: \"32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" Apr 17 16:26:32.760311 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:26:32.760269 2584 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 16:26:32.760411 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:26:32.760361 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f-cert podName:32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f nodeName:}" failed. No retries permitted until 2026-04-17 16:26:33.260339661 +0000 UTC m=+423.634252582 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f-cert") pod "odh-model-controller-858dbf95b8-6wdhg" (UID: "32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f") : secret "odh-model-controller-webhook-cert" not found Apr 17 16:26:32.769392 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:32.769367 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvwr5\" (UniqueName: \"kubernetes.io/projected/32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f-kube-api-access-tvwr5\") pod \"odh-model-controller-858dbf95b8-6wdhg\" (UID: \"32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" Apr 17 16:26:33.265159 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:33.265082 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f-cert\") pod \"odh-model-controller-858dbf95b8-6wdhg\" (UID: \"32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" Apr 17 16:26:33.265358 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:26:33.265223 2584 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 16:26:33.265358 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:26:33.265302 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f-cert podName:32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f nodeName:}" failed. No retries permitted until 2026-04-17 16:26:34.265282544 +0000 UTC m=+424.639195474 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f-cert") pod "odh-model-controller-858dbf95b8-6wdhg" (UID: "32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f") : secret "odh-model-controller-webhook-cert" not found Apr 17 16:26:33.397185 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:33.397161 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 16:26:34.275256 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:34.275220 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f-cert\") pod \"odh-model-controller-858dbf95b8-6wdhg\" (UID: \"32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" Apr 17 16:26:34.277771 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:34.277747 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f-cert\") pod \"odh-model-controller-858dbf95b8-6wdhg\" (UID: \"32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" Apr 17 16:26:34.347788 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:34.347756 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" Apr 17 16:26:34.405770 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:34.405676 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws" event={"ID":"93601d38-a19b-45fc-afc9-2d7991a1db51","Type":"ContainerStarted","Data":"f261078395e935f506dd775d585fc3d770fc60eba58e7b75a9b6b5edfc84aa65"} Apr 17 16:26:34.421518 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:34.421434 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-667bf5bb7-p2bws" podStartSLOduration=2.064143173 podStartE2EDuration="5.421417761s" podCreationTimestamp="2026-04-17 16:26:29 +0000 UTC" firstStartedPulling="2026-04-17 16:26:30.037529989 +0000 UTC m=+420.411442912" lastFinishedPulling="2026-04-17 16:26:33.394804578 +0000 UTC m=+423.768717500" observedRunningTime="2026-04-17 16:26:34.420273704 +0000 UTC m=+424.794186645" watchObservedRunningTime="2026-04-17 16:26:34.421417761 +0000 UTC m=+424.795330703" Apr 17 16:26:34.467144 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:34.467080 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-6wdhg"] Apr 17 16:26:34.470350 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:26:34.470317 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32683b8e_e0e0_4ffd_a2a6_473e0e8fb37f.slice/crio-acb31e4294d6e37359e45b85011b57461661697d90ca9994ca21e7dbf498efdc WatchSource:0}: Error finding container acb31e4294d6e37359e45b85011b57461661697d90ca9994ca21e7dbf498efdc: Status 404 returned error can't find the container with id acb31e4294d6e37359e45b85011b57461661697d90ca9994ca21e7dbf498efdc Apr 17 16:26:35.412011 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:35.411979 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" event={"ID":"32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f","Type":"ContainerStarted","Data":"acb31e4294d6e37359e45b85011b57461661697d90ca9994ca21e7dbf498efdc"} Apr 17 16:26:38.421169 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:38.421124 2584 generic.go:358] "Generic (PLEG): container finished" podID="32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f" containerID="005cae332e5d062359e6f08e96d16c1f8ae6f1fc5ec236c2911c29ab154355ce" exitCode=1 Apr 17 16:26:38.421550 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:38.421217 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" event={"ID":"32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f","Type":"ContainerDied","Data":"005cae332e5d062359e6f08e96d16c1f8ae6f1fc5ec236c2911c29ab154355ce"} Apr 17 16:26:38.421550 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:38.421454 2584 scope.go:117] "RemoveContainer" containerID="005cae332e5d062359e6f08e96d16c1f8ae6f1fc5ec236c2911c29ab154355ce" Apr 17 16:26:38.524508 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:38.524473 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-zwdl2"] Apr 17 16:26:38.527544 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:38.527525 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-zwdl2" Apr 17 16:26:38.531213 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:38.531013 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 17 16:26:38.531213 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:38.531019 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-jbf9h\"" Apr 17 16:26:38.541675 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:38.541654 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-zwdl2"] Apr 17 16:26:38.605739 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:38.605706 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19b8ddd2-b0ff-41cd-8200-02fafd91a837-cert\") pod \"kserve-controller-manager-856948b99f-zwdl2\" (UID: \"19b8ddd2-b0ff-41cd-8200-02fafd91a837\") " pod="opendatahub/kserve-controller-manager-856948b99f-zwdl2" Apr 17 16:26:38.605876 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:38.605747 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5g64\" (UniqueName: \"kubernetes.io/projected/19b8ddd2-b0ff-41cd-8200-02fafd91a837-kube-api-access-b5g64\") pod \"kserve-controller-manager-856948b99f-zwdl2\" (UID: \"19b8ddd2-b0ff-41cd-8200-02fafd91a837\") " pod="opendatahub/kserve-controller-manager-856948b99f-zwdl2" Apr 17 16:26:38.707044 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:38.707022 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19b8ddd2-b0ff-41cd-8200-02fafd91a837-cert\") pod \"kserve-controller-manager-856948b99f-zwdl2\" (UID: \"19b8ddd2-b0ff-41cd-8200-02fafd91a837\") " pod="opendatahub/kserve-controller-manager-856948b99f-zwdl2" Apr 17 16:26:38.707136 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:38.707055 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5g64\" (UniqueName: \"kubernetes.io/projected/19b8ddd2-b0ff-41cd-8200-02fafd91a837-kube-api-access-b5g64\") pod \"kserve-controller-manager-856948b99f-zwdl2\" (UID: \"19b8ddd2-b0ff-41cd-8200-02fafd91a837\") " pod="opendatahub/kserve-controller-manager-856948b99f-zwdl2" Apr 17 16:26:38.707183 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:26:38.707154 2584 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 17 16:26:38.707220 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:26:38.707209 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19b8ddd2-b0ff-41cd-8200-02fafd91a837-cert podName:19b8ddd2-b0ff-41cd-8200-02fafd91a837 nodeName:}" failed. No retries permitted until 2026-04-17 16:26:39.207193322 +0000 UTC m=+429.581106246 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/19b8ddd2-b0ff-41cd-8200-02fafd91a837-cert") pod "kserve-controller-manager-856948b99f-zwdl2" (UID: "19b8ddd2-b0ff-41cd-8200-02fafd91a837") : secret "kserve-webhook-server-cert" not found Apr 17 16:26:38.716157 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:38.716135 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5g64\" (UniqueName: \"kubernetes.io/projected/19b8ddd2-b0ff-41cd-8200-02fafd91a837-kube-api-access-b5g64\") pod \"kserve-controller-manager-856948b99f-zwdl2\" (UID: \"19b8ddd2-b0ff-41cd-8200-02fafd91a837\") " pod="opendatahub/kserve-controller-manager-856948b99f-zwdl2" Apr 17 16:26:39.210464 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:39.210434 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19b8ddd2-b0ff-41cd-8200-02fafd91a837-cert\") pod \"kserve-controller-manager-856948b99f-zwdl2\" (UID: \"19b8ddd2-b0ff-41cd-8200-02fafd91a837\") " pod="opendatahub/kserve-controller-manager-856948b99f-zwdl2" Apr 17 16:26:39.212870 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:39.212848 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19b8ddd2-b0ff-41cd-8200-02fafd91a837-cert\") pod \"kserve-controller-manager-856948b99f-zwdl2\" (UID: \"19b8ddd2-b0ff-41cd-8200-02fafd91a837\") " pod="opendatahub/kserve-controller-manager-856948b99f-zwdl2" Apr 17 16:26:39.425819 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:39.425782 2584 generic.go:358] "Generic (PLEG): container finished" podID="32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f" containerID="9b940e0b0aca7f43f5c677fa2f24b48db90c48ca62de36ba4bde5cdc9a9cb2fb" exitCode=1 Apr 17 16:26:39.426247 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:39.425874 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" event={"ID":"32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f","Type":"ContainerDied","Data":"9b940e0b0aca7f43f5c677fa2f24b48db90c48ca62de36ba4bde5cdc9a9cb2fb"} Apr 17 16:26:39.426247 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:39.425920 2584 scope.go:117] "RemoveContainer" containerID="005cae332e5d062359e6f08e96d16c1f8ae6f1fc5ec236c2911c29ab154355ce" Apr 17 16:26:39.426247 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:39.426089 2584 scope.go:117] "RemoveContainer" containerID="9b940e0b0aca7f43f5c677fa2f24b48db90c48ca62de36ba4bde5cdc9a9cb2fb" Apr 17 16:26:39.426364 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:26:39.426295 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-6wdhg_opendatahub(32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f)\"" pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" podUID="32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f" Apr 17 16:26:39.440049 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:39.440033 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-zwdl2" Apr 17 16:26:39.555107 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:39.555069 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-zwdl2"] Apr 17 16:26:39.557769 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:26:39.557743 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b8ddd2_b0ff_41cd_8200_02fafd91a837.slice/crio-146fdb9f6c0deef29367432371d3eb8d2ac85d76bba4ec425156035ab7167cbc WatchSource:0}: Error finding container 146fdb9f6c0deef29367432371d3eb8d2ac85d76bba4ec425156035ab7167cbc: Status 404 returned error can't find the container with id 146fdb9f6c0deef29367432371d3eb8d2ac85d76bba4ec425156035ab7167cbc Apr 17 16:26:40.429994 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:40.429956 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-zwdl2" event={"ID":"19b8ddd2-b0ff-41cd-8200-02fafd91a837","Type":"ContainerStarted","Data":"146fdb9f6c0deef29367432371d3eb8d2ac85d76bba4ec425156035ab7167cbc"} Apr 17 16:26:40.431509 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:40.431482 2584 scope.go:117] "RemoveContainer" containerID="9b940e0b0aca7f43f5c677fa2f24b48db90c48ca62de36ba4bde5cdc9a9cb2fb" Apr 17 16:26:40.431681 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:26:40.431665 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-6wdhg_opendatahub(32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f)\"" pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" podUID="32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f" Apr 17 16:26:42.438285 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:42.438254 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-zwdl2" event={"ID":"19b8ddd2-b0ff-41cd-8200-02fafd91a837","Type":"ContainerStarted","Data":"3e8f6b7b222ef07012a059637bf35602dc5db66a50ccc420028536a56374b6ab"} Apr 17 16:26:42.438709 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:42.438372 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-zwdl2" Apr 17 16:26:42.463301 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:42.463248 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-zwdl2" podStartSLOduration=1.684679135 podStartE2EDuration="4.463230797s" podCreationTimestamp="2026-04-17 16:26:38 +0000 UTC" firstStartedPulling="2026-04-17 16:26:39.559072469 +0000 UTC m=+429.932985389" lastFinishedPulling="2026-04-17 16:26:42.337624128 +0000 UTC m=+432.711537051" observedRunningTime="2026-04-17 16:26:42.460207669 +0000 UTC m=+432.834120611" watchObservedRunningTime="2026-04-17 16:26:42.463230797 +0000 UTC m=+432.837143743" Apr 17 16:26:44.171394 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.171360 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr"] Apr 17 16:26:44.174632 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.174616 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr" Apr 17 16:26:44.177370 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.177349 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 17 16:26:44.177635 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.177617 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 17 16:26:44.177909 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.177882 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-642sg\"" Apr 17 16:26:44.189124 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.189104 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr"] Apr 17 16:26:44.249042 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.249007 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f42db198-c0d0-4a44-98f8-a81ef7bbabd8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-nrrkr\" (UID: \"f42db198-c0d0-4a44-98f8-a81ef7bbabd8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr" Apr 17 16:26:44.249042 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.249049 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf56m\" (UniqueName: \"kubernetes.io/projected/f42db198-c0d0-4a44-98f8-a81ef7bbabd8-kube-api-access-jf56m\") pod \"servicemesh-operator3-55f49c5f94-nrrkr\" (UID: \"f42db198-c0d0-4a44-98f8-a81ef7bbabd8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr" Apr 17 16:26:44.348216 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.348187 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" Apr 17 16:26:44.348568 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.348554 2584 scope.go:117] "RemoveContainer" containerID="9b940e0b0aca7f43f5c677fa2f24b48db90c48ca62de36ba4bde5cdc9a9cb2fb" Apr 17 16:26:44.348755 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:26:44.348738 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-6wdhg_opendatahub(32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f)\"" pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" podUID="32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f" Apr 17 16:26:44.349749 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.349726 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f42db198-c0d0-4a44-98f8-a81ef7bbabd8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-nrrkr\" (UID: \"f42db198-c0d0-4a44-98f8-a81ef7bbabd8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr" Apr 17 16:26:44.349794 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.349762 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jf56m\" (UniqueName: \"kubernetes.io/projected/f42db198-c0d0-4a44-98f8-a81ef7bbabd8-kube-api-access-jf56m\") pod \"servicemesh-operator3-55f49c5f94-nrrkr\" (UID: 
\"f42db198-c0d0-4a44-98f8-a81ef7bbabd8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr" Apr 17 16:26:44.352404 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.352385 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f42db198-c0d0-4a44-98f8-a81ef7bbabd8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-nrrkr\" (UID: \"f42db198-c0d0-4a44-98f8-a81ef7bbabd8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr" Apr 17 16:26:44.362359 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.362337 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf56m\" (UniqueName: \"kubernetes.io/projected/f42db198-c0d0-4a44-98f8-a81ef7bbabd8-kube-api-access-jf56m\") pod \"servicemesh-operator3-55f49c5f94-nrrkr\" (UID: \"f42db198-c0d0-4a44-98f8-a81ef7bbabd8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr" Apr 17 16:26:44.484914 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.484825 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr" Apr 17 16:26:44.611869 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:44.611704 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr"] Apr 17 16:26:44.614768 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:26:44.614735 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf42db198_c0d0_4a44_98f8_a81ef7bbabd8.slice/crio-9cb2e83265603be59952a3af5934293a6132abf8d907ef0ef39ac0349b7c9fa5 WatchSource:0}: Error finding container 9cb2e83265603be59952a3af5934293a6132abf8d907ef0ef39ac0349b7c9fa5: Status 404 returned error can't find the container with id 9cb2e83265603be59952a3af5934293a6132abf8d907ef0ef39ac0349b7c9fa5 Apr 17 16:26:45.449851 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:45.449813 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr" event={"ID":"f42db198-c0d0-4a44-98f8-a81ef7bbabd8","Type":"ContainerStarted","Data":"9cb2e83265603be59952a3af5934293a6132abf8d907ef0ef39ac0349b7c9fa5"} Apr 17 16:26:47.459604 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:47.459513 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr" event={"ID":"f42db198-c0d0-4a44-98f8-a81ef7bbabd8","Type":"ContainerStarted","Data":"3ea53468a8a31a66d62f985e0165a89f206823da6199ac6c03a8ffecb4b9b8a7"} Apr 17 16:26:47.459604 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:47.459587 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr" Apr 17 16:26:47.481257 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:47.479454 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr" podStartSLOduration=0.965297226 podStartE2EDuration="3.479439628s" podCreationTimestamp="2026-04-17 16:26:44 +0000 UTC" firstStartedPulling="2026-04-17 16:26:44.617421971 +0000 UTC m=+434.991334890" lastFinishedPulling="2026-04-17 16:26:47.131564372 +0000 UTC m=+437.505477292" observedRunningTime="2026-04-17 16:26:47.477772979 +0000 UTC m=+437.851685920" watchObservedRunningTime="2026-04-17 16:26:47.479439628 +0000 UTC 
m=+437.853352574" Apr 17 16:26:54.348552 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:54.348511 2584 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" Apr 17 16:26:54.348908 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:54.348858 2584 scope.go:117] "RemoveContainer" containerID="9b940e0b0aca7f43f5c677fa2f24b48db90c48ca62de36ba4bde5cdc9a9cb2fb" Apr 17 16:26:55.487023 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:55.486984 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" event={"ID":"32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f","Type":"ContainerStarted","Data":"eb6ef45ed2c45023bee0edbbf1436e103648aabcbb90df4583b30cdacebd0e61"} Apr 17 16:26:55.487413 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:55.487181 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" Apr 17 16:26:55.504681 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:55.504626 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" podStartSLOduration=3.368526267 podStartE2EDuration="23.504611671s" podCreationTimestamp="2026-04-17 16:26:32 +0000 UTC" firstStartedPulling="2026-04-17 16:26:34.472179569 +0000 UTC m=+424.846092488" lastFinishedPulling="2026-04-17 16:26:54.608264973 +0000 UTC m=+444.982177892" observedRunningTime="2026-04-17 16:26:55.504299639 +0000 UTC m=+445.878212583" watchObservedRunningTime="2026-04-17 16:26:55.504611671 +0000 UTC m=+445.878524615" Apr 17 16:26:58.465309 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:58.465280 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nrrkr" Apr 17 16:26:59.882235 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.882205 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v"] Apr 17 16:26:59.885283 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.885262 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:26:59.888757 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.888736 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 16:26:59.889405 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.889383 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 16:26:59.889661 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.889646 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 17 16:26:59.889990 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.889972 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-9hdpc\"" Apr 17 16:26:59.890085 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.890011 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 17 16:26:59.907767 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.907738 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v"] Apr 17 16:26:59.963354 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.963326 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/418141b9-bb94-4b5f-a9bc-ec3afc664f02-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:26:59.963543 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.963371 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/418141b9-bb94-4b5f-a9bc-ec3afc664f02-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:26:59.963543 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.963390 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/418141b9-bb94-4b5f-a9bc-ec3afc664f02-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:26:59.963543 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.963447 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/418141b9-bb94-4b5f-a9bc-ec3afc664f02-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:26:59.963543 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.963521 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/418141b9-bb94-4b5f-a9bc-ec3afc664f02-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") 
" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:26:59.963810 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.963585 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/418141b9-bb94-4b5f-a9bc-ec3afc664f02-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:26:59.963810 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:26:59.963610 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl54t\" (UniqueName: \"kubernetes.io/projected/418141b9-bb94-4b5f-a9bc-ec3afc664f02-kube-api-access-tl54t\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.064798 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.064764 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/418141b9-bb94-4b5f-a9bc-ec3afc664f02-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.064798 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.064802 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tl54t\" (UniqueName: \"kubernetes.io/projected/418141b9-bb94-4b5f-a9bc-ec3afc664f02-kube-api-access-tl54t\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.065064 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.064830 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/418141b9-bb94-4b5f-a9bc-ec3afc664f02-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.065064 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.064879 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/418141b9-bb94-4b5f-a9bc-ec3afc664f02-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.065064 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.064900 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/418141b9-bb94-4b5f-a9bc-ec3afc664f02-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.065213 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.065101 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/418141b9-bb94-4b5f-a9bc-ec3afc664f02-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: 
\"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.065213 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.065169 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/418141b9-bb94-4b5f-a9bc-ec3afc664f02-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.066326 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.066303 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/418141b9-bb94-4b5f-a9bc-ec3afc664f02-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.067887 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.067859 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/418141b9-bb94-4b5f-a9bc-ec3afc664f02-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.067992 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.067920 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/418141b9-bb94-4b5f-a9bc-ec3afc664f02-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.068409 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.068388 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/418141b9-bb94-4b5f-a9bc-ec3afc664f02-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.068755 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.068737 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/418141b9-bb94-4b5f-a9bc-ec3afc664f02-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.079670 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.079633 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/418141b9-bb94-4b5f-a9bc-ec3afc664f02-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.079810 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.079793 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl54t\" (UniqueName: \"kubernetes.io/projected/418141b9-bb94-4b5f-a9bc-ec3afc664f02-kube-api-access-tl54t\") pod \"istiod-openshift-gateway-55ff986f96-xmd2v\" (UID: \"418141b9-bb94-4b5f-a9bc-ec3afc664f02\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.195108 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.195021 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:00.330116 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.329982 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v"] Apr 17 16:27:00.332367 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:27:00.332337 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod418141b9_bb94_4b5f_a9bc_ec3afc664f02.slice/crio-ddc11b3092d970ab6458f001d9540362737f7975cf102bd0be68f743f53c86f0 WatchSource:0}: Error finding container ddc11b3092d970ab6458f001d9540362737f7975cf102bd0be68f743f53c86f0: Status 404 returned error can't find the container with id ddc11b3092d970ab6458f001d9540362737f7975cf102bd0be68f743f53c86f0 Apr 17 16:27:00.503320 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:00.503231 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" event={"ID":"418141b9-bb94-4b5f-a9bc-ec3afc664f02","Type":"ContainerStarted","Data":"ddc11b3092d970ab6458f001d9540362737f7975cf102bd0be68f743f53c86f0"} Apr 17 16:27:03.180307 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:03.180252 2584 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892168Ki","pods":"250"} Apr 17 16:27:03.180693 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:03.180341 2584 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892168Ki","pods":"250"} Apr 17 16:27:03.515555 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:03.515466 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" event={"ID":"418141b9-bb94-4b5f-a9bc-ec3afc664f02","Type":"ContainerStarted","Data":"91454f11ce0d189eb2905ea92086f93c7be8bf0918b9536851ee0e4f4466ba51"} Apr 17 16:27:03.515693 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:03.515598 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:03.536345 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:03.536293 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" podStartSLOduration=1.690553209 podStartE2EDuration="4.536277933s" podCreationTimestamp="2026-04-17 16:26:59 +0000 UTC" firstStartedPulling="2026-04-17 16:27:00.33426898 +0000 UTC m=+450.708181899" lastFinishedPulling="2026-04-17 16:27:03.17999369 +0000 UTC m=+453.553906623" observedRunningTime="2026-04-17 16:27:03.534082617 +0000 UTC m=+453.907995576" watchObservedRunningTime="2026-04-17 16:27:03.536277933 +0000 UTC m=+453.910190874" Apr 17 16:27:04.521042 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:04.521001 2584 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-xmd2v container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 17 16:27:04.521419 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:04.521063 
2584 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" podUID="418141b9-bb94-4b5f-a9bc-ec3afc664f02" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:27:06.492845 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:06.492770 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-6wdhg" Apr 17 16:27:07.520839 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:07.520804 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xmd2v" Apr 17 16:27:13.446279 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:27:13.446248 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-zwdl2" Apr 17 16:28:04.920776 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:04.920741 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-bffnp"] Apr 17 16:28:04.922745 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:04.922729 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bffnp" Apr 17 16:28:04.925321 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:04.925302 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 16:28:04.925424 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:04.925395 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 16:28:04.926565 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:04.926545 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-4x6zz\"" Apr 17 16:28:04.926676 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:04.926545 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 16:28:04.931886 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:04.931863 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-bffnp"] Apr 17 16:28:05.041035 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:05.040998 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9jsr\" (UniqueName: \"kubernetes.io/projected/a6614bb7-4d04-4a5b-b0ff-6af39b78c2ed-kube-api-access-l9jsr\") pod \"dns-operator-controller-manager-648d5c98bc-bffnp\" (UID: \"a6614bb7-4d04-4a5b-b0ff-6af39b78c2ed\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bffnp" Apr 17 16:28:05.141869 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:05.141837 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9jsr\" (UniqueName: \"kubernetes.io/projected/a6614bb7-4d04-4a5b-b0ff-6af39b78c2ed-kube-api-access-l9jsr\") pod \"dns-operator-controller-manager-648d5c98bc-bffnp\" (UID: \"a6614bb7-4d04-4a5b-b0ff-6af39b78c2ed\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bffnp" Apr 17 16:28:05.151267 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:05.151242 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9jsr\" 
(UniqueName: \"kubernetes.io/projected/a6614bb7-4d04-4a5b-b0ff-6af39b78c2ed-kube-api-access-l9jsr\") pod \"dns-operator-controller-manager-648d5c98bc-bffnp\" (UID: \"a6614bb7-4d04-4a5b-b0ff-6af39b78c2ed\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bffnp" Apr 17 16:28:05.234099 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:05.234031 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bffnp" Apr 17 16:28:05.364548 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:05.364527 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-bffnp"] Apr 17 16:28:05.366670 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:28:05.366640 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6614bb7_4d04_4a5b_b0ff_6af39b78c2ed.slice/crio-0eff4c276d4e3adeb2245299cc7c6f369260aad3881f1656883eb9c7941f890d WatchSource:0}: Error finding container 0eff4c276d4e3adeb2245299cc7c6f369260aad3881f1656883eb9c7941f890d: Status 404 returned error can't find the container with id 0eff4c276d4e3adeb2245299cc7c6f369260aad3881f1656883eb9c7941f890d Apr 17 16:28:05.701741 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:05.701702 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bffnp" event={"ID":"a6614bb7-4d04-4a5b-b0ff-6af39b78c2ed","Type":"ContainerStarted","Data":"0eff4c276d4e3adeb2245299cc7c6f369260aad3881f1656883eb9c7941f890d"} Apr 17 16:28:08.217191 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:08.217158 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-6h9gv"] Apr 17 16:28:08.220485 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:08.220463 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-6h9gv" Apr 17 16:28:08.226742 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:08.226716 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-5lht7\"" Apr 17 16:28:08.233163 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:08.233142 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-6h9gv"] Apr 17 16:28:08.265786 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:08.265754 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29dxf\" (UniqueName: \"kubernetes.io/projected/769b3620-4b0f-499b-b643-454c071e007d-kube-api-access-29dxf\") pod \"authorino-operator-657f44b778-6h9gv\" (UID: \"769b3620-4b0f-499b-b643-454c071e007d\") " pod="kuadrant-system/authorino-operator-657f44b778-6h9gv" Apr 17 16:28:08.366328 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:08.366288 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29dxf\" (UniqueName: \"kubernetes.io/projected/769b3620-4b0f-499b-b643-454c071e007d-kube-api-access-29dxf\") pod \"authorino-operator-657f44b778-6h9gv\" (UID: \"769b3620-4b0f-499b-b643-454c071e007d\") " pod="kuadrant-system/authorino-operator-657f44b778-6h9gv" Apr 17 16:28:08.374794 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:08.374764 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29dxf\" (UniqueName: \"kubernetes.io/projected/769b3620-4b0f-499b-b643-454c071e007d-kube-api-access-29dxf\") pod \"authorino-operator-657f44b778-6h9gv\" (UID: \"769b3620-4b0f-499b-b643-454c071e007d\") " pod="kuadrant-system/authorino-operator-657f44b778-6h9gv" Apr 17 16:28:08.530389 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:08.530365 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-6h9gv" Apr 17 16:28:08.657152 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:08.657122 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-6h9gv"] Apr 17 16:28:08.660394 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:28:08.660362 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod769b3620_4b0f_499b_b643_454c071e007d.slice/crio-cd383695d89dda597b8cb7ec24800f909a4d374161206c2ed900e4bf47e4a0cf WatchSource:0}: Error finding container cd383695d89dda597b8cb7ec24800f909a4d374161206c2ed900e4bf47e4a0cf: Status 404 returned error can't find the container with id cd383695d89dda597b8cb7ec24800f909a4d374161206c2ed900e4bf47e4a0cf Apr 17 16:28:08.715905 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:08.715870 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-6h9gv" event={"ID":"769b3620-4b0f-499b-b643-454c071e007d","Type":"ContainerStarted","Data":"cd383695d89dda597b8cb7ec24800f909a4d374161206c2ed900e4bf47e4a0cf"} Apr 17 16:28:08.717104 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:08.717079 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bffnp" event={"ID":"a6614bb7-4d04-4a5b-b0ff-6af39b78c2ed","Type":"ContainerStarted","Data":"25dd2ffb15c5553daf89502150d3dab19a0e99678f6ec131b1b68d8ffe58e93c"} Apr 17 16:28:08.717229 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:08.717215 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bffnp" Apr 17 16:28:08.733478 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:08.733440 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bffnp" podStartSLOduration=2.330187702 podStartE2EDuration="4.733428616s" podCreationTimestamp="2026-04-17 16:28:04 +0000 UTC" firstStartedPulling="2026-04-17 16:28:05.368680031 +0000 UTC m=+515.742592952" lastFinishedPulling="2026-04-17 16:28:07.771920933 +0000 UTC m=+518.145833866" observedRunningTime="2026-04-17 16:28:08.732315861 +0000 UTC m=+519.106228806" watchObservedRunningTime="2026-04-17 16:28:08.733428616 +0000 UTC m=+519.107341643" Apr 17 16:28:10.726557 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:10.726448 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-6h9gv" event={"ID":"769b3620-4b0f-499b-b643-454c071e007d","Type":"ContainerStarted","Data":"cade3523adcee2c69cd01e12766d4bd49452b0041316467e1032bff21b940d9d"} Apr 17 16:28:10.726923 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:10.726569 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-6h9gv" Apr 17 16:28:10.742963 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:10.742918 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-6h9gv" podStartSLOduration=1.074191677 podStartE2EDuration="2.742905358s" podCreationTimestamp="2026-04-17 16:28:08 +0000 UTC" firstStartedPulling="2026-04-17 16:28:08.662401698 +0000 UTC m=+519.036314621" lastFinishedPulling="2026-04-17 16:28:10.331115379 +0000 UTC m=+520.705028302" observedRunningTime="2026-04-17 16:28:10.740924691 +0000 UTC 
m=+521.114837636" watchObservedRunningTime="2026-04-17 16:28:10.742905358 +0000 UTC m=+521.116818300" Apr 17 16:28:19.724053 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:19.723973 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bffnp" Apr 17 16:28:21.057353 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:21.057316 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9pzjg"] Apr 17 16:28:21.060257 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:21.060242 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9pzjg" Apr 17 16:28:21.062779 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:21.062758 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-q2z79\"" Apr 17 16:28:21.072646 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:21.072624 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9pzjg"] Apr 17 16:28:21.162186 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:21.162158 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccxdw\" (UniqueName: \"kubernetes.io/projected/e10ece49-2669-4bfc-9c02-3836ad5abb9f-kube-api-access-ccxdw\") pod \"limitador-operator-controller-manager-85c4996f8c-9pzjg\" (UID: \"e10ece49-2669-4bfc-9c02-3836ad5abb9f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9pzjg" Apr 17 16:28:21.263162 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:21.263121 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccxdw\" (UniqueName: \"kubernetes.io/projected/e10ece49-2669-4bfc-9c02-3836ad5abb9f-kube-api-access-ccxdw\") pod \"limitador-operator-controller-manager-85c4996f8c-9pzjg\" (UID: \"e10ece49-2669-4bfc-9c02-3836ad5abb9f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9pzjg" Apr 17 16:28:21.276615 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:21.276591 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccxdw\" (UniqueName: \"kubernetes.io/projected/e10ece49-2669-4bfc-9c02-3836ad5abb9f-kube-api-access-ccxdw\") pod \"limitador-operator-controller-manager-85c4996f8c-9pzjg\" (UID: \"e10ece49-2669-4bfc-9c02-3836ad5abb9f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9pzjg" Apr 17 16:28:21.369596 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:21.369526 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9pzjg" Apr 17 16:28:21.484935 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:21.484912 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9pzjg"] Apr 17 16:28:21.487470 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:28:21.487440 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode10ece49_2669_4bfc_9c02_3836ad5abb9f.slice/crio-7922bb01a07e67498d8085a7ee95316f6bb48c034f22c6cb4c1de29195d65998 WatchSource:0}: Error finding container 7922bb01a07e67498d8085a7ee95316f6bb48c034f22c6cb4c1de29195d65998: Status 404 returned error can't find the container with id 7922bb01a07e67498d8085a7ee95316f6bb48c034f22c6cb4c1de29195d65998 Apr 17 16:28:21.732343 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:21.732267 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-6h9gv" Apr 17 16:28:21.761043 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:21.761009 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9pzjg" event={"ID":"e10ece49-2669-4bfc-9c02-3836ad5abb9f","Type":"ContainerStarted","Data":"7922bb01a07e67498d8085a7ee95316f6bb48c034f22c6cb4c1de29195d65998"} Apr 17 16:28:23.770466 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:23.770436 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9pzjg" event={"ID":"e10ece49-2669-4bfc-9c02-3836ad5abb9f","Type":"ContainerStarted","Data":"a445c24acd5cca538dd2a1e50dd7d423a5f4763a841ea811876de235fe4ef676"} Apr 17 16:28:23.770837 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:23.770530 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9pzjg" Apr 17 16:28:23.790997 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:23.790946 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9pzjg" podStartSLOduration=1.221242417 podStartE2EDuration="2.790932168s" podCreationTimestamp="2026-04-17 16:28:21 +0000 UTC" firstStartedPulling="2026-04-17 16:28:21.489408039 +0000 UTC m=+531.863320958" lastFinishedPulling="2026-04-17 16:28:23.059097786 +0000 UTC m=+533.433010709" observedRunningTime="2026-04-17 16:28:23.790578439 +0000 UTC m=+534.164491380" watchObservedRunningTime="2026-04-17 16:28:23.790932168 +0000 UTC m=+534.164845111" Apr 17 16:28:34.775775 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:28:34.775740 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9pzjg" Apr 17 16:29:31.572114 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:31.572085 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-d7886bf94-hsr54"] Apr 17 16:29:31.575223 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:31.575207 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-d7886bf94-hsr54" Apr 17 16:29:31.577760 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:31.577738 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-z7k5r\"" Apr 17 16:29:31.582037 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:31.582014 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-d7886bf94-hsr54"] Apr 17 16:29:31.695077 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:31.695046 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h85b\" (UniqueName: \"kubernetes.io/projected/98289d93-3295-41f7-bf90-ef56a131aa1e-kube-api-access-2h85b\") pod \"maas-controller-d7886bf94-hsr54\" (UID: \"98289d93-3295-41f7-bf90-ef56a131aa1e\") " pod="opendatahub/maas-controller-d7886bf94-hsr54" Apr 17 16:29:31.796187 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:31.796156 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2h85b\" (UniqueName: \"kubernetes.io/projected/98289d93-3295-41f7-bf90-ef56a131aa1e-kube-api-access-2h85b\") pod \"maas-controller-d7886bf94-hsr54\" (UID: \"98289d93-3295-41f7-bf90-ef56a131aa1e\") " pod="opendatahub/maas-controller-d7886bf94-hsr54" Apr 17 16:29:31.804351 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:31.804318 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h85b\" (UniqueName: \"kubernetes.io/projected/98289d93-3295-41f7-bf90-ef56a131aa1e-kube-api-access-2h85b\") pod \"maas-controller-d7886bf94-hsr54\" (UID: \"98289d93-3295-41f7-bf90-ef56a131aa1e\") " pod="opendatahub/maas-controller-d7886bf94-hsr54" Apr 17 16:29:31.886242 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:31.886182 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-d7886bf94-hsr54" Apr 17 16:29:32.004712 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:32.004586 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-d7886bf94-hsr54"] Apr 17 16:29:32.007277 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:29:32.007250 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98289d93_3295_41f7_bf90_ef56a131aa1e.slice/crio-2cd1f5144bf406d8de8a154ca26fed362b834d033ed59dbb8507f470fc83b5eb WatchSource:0}: Error finding container 2cd1f5144bf406d8de8a154ca26fed362b834d033ed59dbb8507f470fc83b5eb: Status 404 returned error can't find the container with id 2cd1f5144bf406d8de8a154ca26fed362b834d033ed59dbb8507f470fc83b5eb Apr 17 16:29:32.991012 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:32.990964 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-d7886bf94-hsr54" event={"ID":"98289d93-3295-41f7-bf90-ef56a131aa1e","Type":"ContainerStarted","Data":"2cd1f5144bf406d8de8a154ca26fed362b834d033ed59dbb8507f470fc83b5eb"} Apr 17 16:29:34.999134 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:34.999099 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-d7886bf94-hsr54" event={"ID":"98289d93-3295-41f7-bf90-ef56a131aa1e","Type":"ContainerStarted","Data":"ef0939f02a679ba43e716d0d057b676398c363a2b60c2625f1259898ca912946"} Apr 17 16:29:34.999579 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:34.999215 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-d7886bf94-hsr54" Apr 17 16:29:35.018752 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:35.018699 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-d7886bf94-hsr54" podStartSLOduration=1.253713039 podStartE2EDuration="4.018682866s" podCreationTimestamp="2026-04-17 16:29:31 +0000 UTC" firstStartedPulling="2026-04-17 16:29:32.008643481 +0000 UTC m=+602.382556415" lastFinishedPulling="2026-04-17 16:29:34.773613312 +0000 UTC m=+605.147526242" observedRunningTime="2026-04-17 16:29:35.016868588 +0000 UTC m=+605.390781531" watchObservedRunningTime="2026-04-17 16:29:35.018682866 +0000 UTC m=+605.392595809" Apr 17 16:29:46.007318 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:46.007282 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-d7886bf94-hsr54" Apr 17 16:29:58.485269 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:58.485194 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-d7886bf94-hsr54"] Apr 17 16:29:58.485652 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:58.485411 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-d7886bf94-hsr54" podUID="98289d93-3295-41f7-bf90-ef56a131aa1e" containerName="manager" containerID="cri-o://ef0939f02a679ba43e716d0d057b676398c363a2b60c2625f1259898ca912946" gracePeriod=10 Apr 17 16:29:58.715575 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:58.715547 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-d7886bf94-hsr54" Apr 17 16:29:58.771967 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:58.771904 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h85b\" (UniqueName: \"kubernetes.io/projected/98289d93-3295-41f7-bf90-ef56a131aa1e-kube-api-access-2h85b\") pod \"98289d93-3295-41f7-bf90-ef56a131aa1e\" (UID: \"98289d93-3295-41f7-bf90-ef56a131aa1e\") " Apr 17 16:29:58.774077 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:58.774048 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98289d93-3295-41f7-bf90-ef56a131aa1e-kube-api-access-2h85b" (OuterVolumeSpecName: "kube-api-access-2h85b") pod "98289d93-3295-41f7-bf90-ef56a131aa1e" (UID: "98289d93-3295-41f7-bf90-ef56a131aa1e"). InnerVolumeSpecName "kube-api-access-2h85b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:29:58.872567 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:58.872540 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2h85b\" (UniqueName: \"kubernetes.io/projected/98289d93-3295-41f7-bf90-ef56a131aa1e-kube-api-access-2h85b\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 17 16:29:59.079192 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:59.079158 2584 generic.go:358] "Generic (PLEG): container finished" podID="98289d93-3295-41f7-bf90-ef56a131aa1e" containerID="ef0939f02a679ba43e716d0d057b676398c363a2b60c2625f1259898ca912946" exitCode=0 Apr 17 16:29:59.079320 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:59.079210 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-d7886bf94-hsr54" Apr 17 16:29:59.079320 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:59.079240 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-d7886bf94-hsr54" event={"ID":"98289d93-3295-41f7-bf90-ef56a131aa1e","Type":"ContainerDied","Data":"ef0939f02a679ba43e716d0d057b676398c363a2b60c2625f1259898ca912946"} Apr 17 16:29:59.079320 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:59.079277 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-d7886bf94-hsr54" event={"ID":"98289d93-3295-41f7-bf90-ef56a131aa1e","Type":"ContainerDied","Data":"2cd1f5144bf406d8de8a154ca26fed362b834d033ed59dbb8507f470fc83b5eb"} Apr 17 16:29:59.079320 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:59.079293 2584 scope.go:117] "RemoveContainer" containerID="ef0939f02a679ba43e716d0d057b676398c363a2b60c2625f1259898ca912946" Apr 17 16:29:59.087719 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:59.087702 2584 scope.go:117] "RemoveContainer" containerID="ef0939f02a679ba43e716d0d057b676398c363a2b60c2625f1259898ca912946" Apr 17 16:29:59.087971 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:29:59.087947 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef0939f02a679ba43e716d0d057b676398c363a2b60c2625f1259898ca912946\": container with ID starting with ef0939f02a679ba43e716d0d057b676398c363a2b60c2625f1259898ca912946 not found: ID does not exist" containerID="ef0939f02a679ba43e716d0d057b676398c363a2b60c2625f1259898ca912946" Apr 17 16:29:59.088015 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:59.087979 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef0939f02a679ba43e716d0d057b676398c363a2b60c2625f1259898ca912946"} 
err="failed to get container status \"ef0939f02a679ba43e716d0d057b676398c363a2b60c2625f1259898ca912946\": rpc error: code = NotFound desc = could not find container \"ef0939f02a679ba43e716d0d057b676398c363a2b60c2625f1259898ca912946\": container with ID starting with ef0939f02a679ba43e716d0d057b676398c363a2b60c2625f1259898ca912946 not found: ID does not exist" Apr 17 16:29:59.099091 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:59.099061 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-d7886bf94-hsr54"] Apr 17 16:29:59.103249 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:29:59.103229 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-d7886bf94-hsr54"] Apr 17 16:30:00.136212 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:00.136182 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29607390-7w2nz"] Apr 17 16:30:00.136616 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:00.136508 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98289d93-3295-41f7-bf90-ef56a131aa1e" containerName="manager" Apr 17 16:30:00.136616 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:00.136523 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="98289d93-3295-41f7-bf90-ef56a131aa1e" containerName="manager" Apr 17 16:30:00.136616 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:00.136610 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="98289d93-3295-41f7-bf90-ef56a131aa1e" containerName="manager" Apr 17 16:30:00.140825 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:00.140802 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" Apr 17 16:30:00.143413 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:00.143391 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-ps47p\"" Apr 17 16:30:00.148266 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:00.147989 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607390-7w2nz"] Apr 17 16:30:00.180680 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:00.180650 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq5l8\" (UniqueName: \"kubernetes.io/projected/9aba66a5-0d65-420a-a453-0a383587784e-kube-api-access-qq5l8\") pod \"maas-api-key-cleanup-29607390-7w2nz\" (UID: \"9aba66a5-0d65-420a-a453-0a383587784e\") " pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" Apr 17 16:30:00.190018 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:00.189979 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98289d93-3295-41f7-bf90-ef56a131aa1e" path="/var/lib/kubelet/pods/98289d93-3295-41f7-bf90-ef56a131aa1e/volumes" Apr 17 16:30:00.281200 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:00.281159 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qq5l8\" (UniqueName: \"kubernetes.io/projected/9aba66a5-0d65-420a-a453-0a383587784e-kube-api-access-qq5l8\") pod \"maas-api-key-cleanup-29607390-7w2nz\" (UID: \"9aba66a5-0d65-420a-a453-0a383587784e\") " pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" Apr 17 16:30:00.289457 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:00.289431 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq5l8\" (UniqueName: 
\"kubernetes.io/projected/9aba66a5-0d65-420a-a453-0a383587784e-kube-api-access-qq5l8\") pod \"maas-api-key-cleanup-29607390-7w2nz\" (UID: \"9aba66a5-0d65-420a-a453-0a383587784e\") " pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" Apr 17 16:30:00.452589 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:00.452472 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" Apr 17 16:30:00.574385 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:00.574352 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607390-7w2nz"] Apr 17 16:30:00.577566 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:30:00.577537 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aba66a5_0d65_420a_a453_0a383587784e.slice/crio-f727fb7b365d92737dfa32b492646c630e68a5dca096a83f8a70ba5aaa7125f8 WatchSource:0}: Error finding container f727fb7b365d92737dfa32b492646c630e68a5dca096a83f8a70ba5aaa7125f8: Status 404 returned error can't find the container with id f727fb7b365d92737dfa32b492646c630e68a5dca096a83f8a70ba5aaa7125f8 Apr 17 16:30:01.086801 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:01.086764 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" event={"ID":"9aba66a5-0d65-420a-a453-0a383587784e","Type":"ContainerStarted","Data":"f727fb7b365d92737dfa32b492646c630e68a5dca096a83f8a70ba5aaa7125f8"} Apr 17 16:30:02.090635 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:02.090601 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" event={"ID":"9aba66a5-0d65-420a-a453-0a383587784e","Type":"ContainerStarted","Data":"955a3df1d7b3710d621ad13386402d8df76f35292e3d21c67f6ce8316ba3dba4"} Apr 17 16:30:02.104912 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:02.104859 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" podStartSLOduration=1.7471340020000001 podStartE2EDuration="2.104842968s" podCreationTimestamp="2026-04-17 16:30:00 +0000 UTC" firstStartedPulling="2026-04-17 16:30:00.579253522 +0000 UTC m=+630.953166443" lastFinishedPulling="2026-04-17 16:30:00.936962484 +0000 UTC m=+631.310875409" observedRunningTime="2026-04-17 16:30:02.104483026 +0000 UTC m=+632.478395980" watchObservedRunningTime="2026-04-17 16:30:02.104842968 +0000 UTC m=+632.478755912" Apr 17 16:30:15.413587 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.410906 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp"] Apr 17 16:30:15.419492 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.419466 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.422235 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.422205 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 16:30:15.422353 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.422282 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 16:30:15.422604 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.422586 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp"] Apr 17 16:30:15.423281 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.423265 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-7xrks\"" Apr 17 16:30:15.423362 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.423291 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 17 16:30:15.481955 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.481922 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/49a52107-4f8c-4824-af45-04d8868e7e5d-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.482113 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.481966 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49a52107-4f8c-4824-af45-04d8868e7e5d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.482113 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.482005 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/49a52107-4f8c-4824-af45-04d8868e7e5d-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.482113 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.482032 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4l84\" (UniqueName: \"kubernetes.io/projected/49a52107-4f8c-4824-af45-04d8868e7e5d-kube-api-access-d4l84\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.482113 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.482073 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/49a52107-4f8c-4824-af45-04d8868e7e5d-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.482113 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.482089 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/49a52107-4f8c-4824-af45-04d8868e7e5d-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.582421 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.582387 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/49a52107-4f8c-4824-af45-04d8868e7e5d-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.582421 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.582421 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/49a52107-4f8c-4824-af45-04d8868e7e5d-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.582670 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.582448 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/49a52107-4f8c-4824-af45-04d8868e7e5d-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.582670 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.582465 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49a52107-4f8c-4824-af45-04d8868e7e5d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.582670 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.582539 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/49a52107-4f8c-4824-af45-04d8868e7e5d-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.582670 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.582580 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4l84\" (UniqueName: \"kubernetes.io/projected/49a52107-4f8c-4824-af45-04d8868e7e5d-kube-api-access-d4l84\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.582867 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.582810 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/49a52107-4f8c-4824-af45-04d8868e7e5d-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.582922 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.582892 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/49a52107-4f8c-4824-af45-04d8868e7e5d-home\") pod 
\"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.582968 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.582952 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49a52107-4f8c-4824-af45-04d8868e7e5d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.584906 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.584884 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/49a52107-4f8c-4824-af45-04d8868e7e5d-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.585000 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.584965 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/49a52107-4f8c-4824-af45-04d8868e7e5d-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.590243 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.590220 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4l84\" (UniqueName: \"kubernetes.io/projected/49a52107-4f8c-4824-af45-04d8868e7e5d-kube-api-access-d4l84\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-r7sfp\" (UID: \"49a52107-4f8c-4824-af45-04d8868e7e5d\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.698699 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.698621 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5"] Apr 17 16:30:15.702458 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.702435 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.704731 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.704710 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 17 16:30:15.711749 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.711723 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5"] Apr 17 16:30:15.730620 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.730597 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:15.785076 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.785041 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6fd14f93-c028-4746-87f3-c785e8b01b3c-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.785209 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.785093 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fd14f93-c028-4746-87f3-c785e8b01b3c-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.785209 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.785131 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6fd14f93-c028-4746-87f3-c785e8b01b3c-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.785209 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.785157 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6fd14f93-c028-4746-87f3-c785e8b01b3c-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.785209 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.785182 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6fd14f93-c028-4746-87f3-c785e8b01b3c-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.785209 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.785205 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hntvr\" (UniqueName: \"kubernetes.io/projected/6fd14f93-c028-4746-87f3-c785e8b01b3c-kube-api-access-hntvr\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.886006 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.885972 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6fd14f93-c028-4746-87f3-c785e8b01b3c-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.886145 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.886038 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fd14f93-c028-4746-87f3-c785e8b01b3c-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.886145 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.886081 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6fd14f93-c028-4746-87f3-c785e8b01b3c-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.886145 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.886113 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6fd14f93-c028-4746-87f3-c785e8b01b3c-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.886145 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.886140 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6fd14f93-c028-4746-87f3-c785e8b01b3c-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.886353 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.886168 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hntvr\" (UniqueName: \"kubernetes.io/projected/6fd14f93-c028-4746-87f3-c785e8b01b3c-kube-api-access-hntvr\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.886410 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.886387 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp"] Apr 17 16:30:15.886492 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.886466 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fd14f93-c028-4746-87f3-c785e8b01b3c-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.886676 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.886658 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6fd14f93-c028-4746-87f3-c785e8b01b3c-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.887285 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.887260 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6fd14f93-c028-4746-87f3-c785e8b01b3c-model-cache\") pod 
\"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.888289 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:30:15.888264 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49a52107_4f8c_4824_af45_04d8868e7e5d.slice/crio-4d085fb431017251098515a0902b3eb8431f9a5634bb3623983627d45d9ea787 WatchSource:0}: Error finding container 4d085fb431017251098515a0902b3eb8431f9a5634bb3623983627d45d9ea787: Status 404 returned error can't find the container with id 4d085fb431017251098515a0902b3eb8431f9a5634bb3623983627d45d9ea787 Apr 17 16:30:15.888727 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.888708 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6fd14f93-c028-4746-87f3-c785e8b01b3c-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.888869 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.888853 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6fd14f93-c028-4746-87f3-c785e8b01b3c-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:15.902575 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:15.902557 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hntvr\" (UniqueName: \"kubernetes.io/projected/6fd14f93-c028-4746-87f3-c785e8b01b3c-kube-api-access-hntvr\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5\" (UID: \"6fd14f93-c028-4746-87f3-c785e8b01b3c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:16.012719 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:16.012631 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:16.135225 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:16.135192 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" event={"ID":"49a52107-4f8c-4824-af45-04d8868e7e5d","Type":"ContainerStarted","Data":"4d085fb431017251098515a0902b3eb8431f9a5634bb3623983627d45d9ea787"} Apr 17 16:30:16.141591 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:16.141567 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5"] Apr 17 16:30:16.143880 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:30:16.143851 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fd14f93_c028_4746_87f3_c785e8b01b3c.slice/crio-192d4b8bf88705f3bad68c12faa776098af40780a572b7f69f63f627c7453ad7 WatchSource:0}: Error finding container 192d4b8bf88705f3bad68c12faa776098af40780a572b7f69f63f627c7453ad7: Status 404 returned error can't find the container with id 192d4b8bf88705f3bad68c12faa776098af40780a572b7f69f63f627c7453ad7 Apr 17 16:30:17.140353 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:17.140316 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" event={"ID":"6fd14f93-c028-4746-87f3-c785e8b01b3c","Type":"ContainerStarted","Data":"192d4b8bf88705f3bad68c12faa776098af40780a572b7f69f63f627c7453ad7"} Apr 17 16:30:23.165739 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:23.164917 2584 generic.go:358] "Generic (PLEG): container finished" podID="9aba66a5-0d65-420a-a453-0a383587784e" containerID="955a3df1d7b3710d621ad13386402d8df76f35292e3d21c67f6ce8316ba3dba4" exitCode=6 Apr 17 16:30:23.165739 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:23.165014 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" event={"ID":"9aba66a5-0d65-420a-a453-0a383587784e","Type":"ContainerDied","Data":"955a3df1d7b3710d621ad13386402d8df76f35292e3d21c67f6ce8316ba3dba4"} Apr 17 16:30:23.165739 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:23.165318 2584 scope.go:117] "RemoveContainer" containerID="955a3df1d7b3710d621ad13386402d8df76f35292e3d21c67f6ce8316ba3dba4" Apr 17 16:30:23.168397 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:23.167803 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" event={"ID":"6fd14f93-c028-4746-87f3-c785e8b01b3c","Type":"ContainerStarted","Data":"b8d6116e91ff3bd32415bbae123e987bdca8d8d99595bf34208f68b7c467969f"} Apr 17 16:30:23.171251 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:23.170335 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" event={"ID":"49a52107-4f8c-4824-af45-04d8868e7e5d","Type":"ContainerStarted","Data":"104167c9930c967b3eeb4168322a21ab743227657b9b30f6cea058f1b9fee4fd"} Apr 17 16:30:24.176148 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:24.176108 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" event={"ID":"9aba66a5-0d65-420a-a453-0a383587784e","Type":"ContainerStarted","Data":"2e741793392068c640d114dc6f8244bbfeda4122ce9612fb4e6a1ad56eacb857"} Apr 17 16:30:29.194084 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:29.194048 2584 generic.go:358] "Generic (PLEG): container finished" 
podID="49a52107-4f8c-4824-af45-04d8868e7e5d" containerID="104167c9930c967b3eeb4168322a21ab743227657b9b30f6cea058f1b9fee4fd" exitCode=0 Apr 17 16:30:29.194422 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:29.194139 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" event={"ID":"49a52107-4f8c-4824-af45-04d8868e7e5d","Type":"ContainerDied","Data":"104167c9930c967b3eeb4168322a21ab743227657b9b30f6cea058f1b9fee4fd"} Apr 17 16:30:34.212608 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:34.212568 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" event={"ID":"49a52107-4f8c-4824-af45-04d8868e7e5d","Type":"ContainerStarted","Data":"82b3986f81d5f5ce417022b62f20528a09d031ce0ea97cdb63e2cb56c20229b2"} Apr 17 16:30:34.213037 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:34.212922 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:34.213797 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:34.213775 2584 generic.go:358] "Generic (PLEG): container finished" podID="6fd14f93-c028-4746-87f3-c785e8b01b3c" containerID="b8d6116e91ff3bd32415bbae123e987bdca8d8d99595bf34208f68b7c467969f" exitCode=0 Apr 17 16:30:34.213884 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:34.213830 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" event={"ID":"6fd14f93-c028-4746-87f3-c785e8b01b3c","Type":"ContainerDied","Data":"b8d6116e91ff3bd32415bbae123e987bdca8d8d99595bf34208f68b7c467969f"} Apr 17 16:30:34.232415 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:34.232366 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" podStartSLOduration=1.773016951 podStartE2EDuration="19.232352955s" podCreationTimestamp="2026-04-17 16:30:15 +0000 UTC" firstStartedPulling="2026-04-17 16:30:15.890299197 +0000 UTC m=+646.264212117" lastFinishedPulling="2026-04-17 16:30:33.349635184 +0000 UTC m=+663.723548121" observedRunningTime="2026-04-17 16:30:34.231106953 +0000 UTC m=+664.605019894" watchObservedRunningTime="2026-04-17 16:30:34.232352955 +0000 UTC m=+664.606265941" Apr 17 16:30:35.218195 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:35.218158 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" event={"ID":"6fd14f93-c028-4746-87f3-c785e8b01b3c","Type":"ContainerStarted","Data":"bb8b4b73e6b4c4ccb4940f0009b04b8c15235672f5af139d5cecd921decd68e6"} Apr 17 16:30:35.218675 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:35.218523 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:35.236260 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:35.236212 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" podStartSLOduration=1.985249488 podStartE2EDuration="20.236195342s" podCreationTimestamp="2026-04-17 16:30:15 +0000 UTC" firstStartedPulling="2026-04-17 16:30:16.145596283 +0000 UTC m=+646.519509203" lastFinishedPulling="2026-04-17 16:30:34.396542137 +0000 UTC m=+664.770455057" observedRunningTime="2026-04-17 16:30:35.234985666 +0000 UTC m=+665.608898607" watchObservedRunningTime="2026-04-17 16:30:35.236195342 +0000 UTC 
m=+665.610108284" Apr 17 16:30:44.250884 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:44.250812 2584 generic.go:358] "Generic (PLEG): container finished" podID="9aba66a5-0d65-420a-a453-0a383587784e" containerID="2e741793392068c640d114dc6f8244bbfeda4122ce9612fb4e6a1ad56eacb857" exitCode=6 Apr 17 16:30:44.251289 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:44.250885 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" event={"ID":"9aba66a5-0d65-420a-a453-0a383587784e","Type":"ContainerDied","Data":"2e741793392068c640d114dc6f8244bbfeda4122ce9612fb4e6a1ad56eacb857"} Apr 17 16:30:44.251289 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:44.250925 2584 scope.go:117] "RemoveContainer" containerID="955a3df1d7b3710d621ad13386402d8df76f35292e3d21c67f6ce8316ba3dba4" Apr 17 16:30:44.251289 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:44.251182 2584 scope.go:117] "RemoveContainer" containerID="2e741793392068c640d114dc6f8244bbfeda4122ce9612fb4e6a1ad56eacb857" Apr 17 16:30:44.251432 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:30:44.251414 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29607390-7w2nz_opendatahub(9aba66a5-0d65-420a-a453-0a383587784e)\"" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" podUID="9aba66a5-0d65-420a-a453-0a383587784e" Apr 17 16:30:45.231372 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:45.231346 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-r7sfp" Apr 17 16:30:46.234409 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:46.234369 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5" Apr 17 16:30:58.185236 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:58.185195 2584 scope.go:117] "RemoveContainer" containerID="2e741793392068c640d114dc6f8244bbfeda4122ce9612fb4e6a1ad56eacb857" Apr 17 16:30:58.186281 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:58.186264 2584 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:30:59.304627 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:30:59.304591 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" event={"ID":"9aba66a5-0d65-420a-a453-0a383587784e","Type":"ContainerStarted","Data":"7431248f46b6b1bad6734c56d9ba27fd258c137c737de6770884c00a2aef4835"} Apr 17 16:31:00.010305 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:00.010268 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607390-7w2nz"] Apr 17 16:31:00.307582 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:00.307545 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" podUID="9aba66a5-0d65-420a-a453-0a383587784e" containerName="cleanup" containerID="cri-o://7431248f46b6b1bad6734c56d9ba27fd258c137c737de6770884c00a2aef4835" gracePeriod=30 Apr 17 16:31:19.253067 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.253044 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" Apr 17 16:31:19.296126 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.296094 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq5l8\" (UniqueName: \"kubernetes.io/projected/9aba66a5-0d65-420a-a453-0a383587784e-kube-api-access-qq5l8\") pod \"9aba66a5-0d65-420a-a453-0a383587784e\" (UID: \"9aba66a5-0d65-420a-a453-0a383587784e\") " Apr 17 16:31:19.298418 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.298388 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aba66a5-0d65-420a-a453-0a383587784e-kube-api-access-qq5l8" (OuterVolumeSpecName: "kube-api-access-qq5l8") pod "9aba66a5-0d65-420a-a453-0a383587784e" (UID: "9aba66a5-0d65-420a-a453-0a383587784e"). InnerVolumeSpecName "kube-api-access-qq5l8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:31:19.366980 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.366913 2584 generic.go:358] "Generic (PLEG): container finished" podID="9aba66a5-0d65-420a-a453-0a383587784e" containerID="7431248f46b6b1bad6734c56d9ba27fd258c137c737de6770884c00a2aef4835" exitCode=6 Apr 17 16:31:19.366980 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.366957 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" event={"ID":"9aba66a5-0d65-420a-a453-0a383587784e","Type":"ContainerDied","Data":"7431248f46b6b1bad6734c56d9ba27fd258c137c737de6770884c00a2aef4835"} Apr 17 16:31:19.366980 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.366979 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" event={"ID":"9aba66a5-0d65-420a-a453-0a383587784e","Type":"ContainerDied","Data":"f727fb7b365d92737dfa32b492646c630e68a5dca096a83f8a70ba5aaa7125f8"} Apr 17 16:31:19.367150 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.366985 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607390-7w2nz" Apr 17 16:31:19.367150 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.366994 2584 scope.go:117] "RemoveContainer" containerID="7431248f46b6b1bad6734c56d9ba27fd258c137c737de6770884c00a2aef4835" Apr 17 16:31:19.375123 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.375108 2584 scope.go:117] "RemoveContainer" containerID="2e741793392068c640d114dc6f8244bbfeda4122ce9612fb4e6a1ad56eacb857" Apr 17 16:31:19.381982 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.381965 2584 scope.go:117] "RemoveContainer" containerID="7431248f46b6b1bad6734c56d9ba27fd258c137c737de6770884c00a2aef4835" Apr 17 16:31:19.382235 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:31:19.382217 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7431248f46b6b1bad6734c56d9ba27fd258c137c737de6770884c00a2aef4835\": container with ID starting with 7431248f46b6b1bad6734c56d9ba27fd258c137c737de6770884c00a2aef4835 not found: ID does not exist" containerID="7431248f46b6b1bad6734c56d9ba27fd258c137c737de6770884c00a2aef4835" Apr 17 16:31:19.382279 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.382242 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7431248f46b6b1bad6734c56d9ba27fd258c137c737de6770884c00a2aef4835"} err="failed to get container status \"7431248f46b6b1bad6734c56d9ba27fd258c137c737de6770884c00a2aef4835\": rpc error: code = NotFound desc = could not find container \"7431248f46b6b1bad6734c56d9ba27fd258c137c737de6770884c00a2aef4835\": container with ID starting with 7431248f46b6b1bad6734c56d9ba27fd258c137c737de6770884c00a2aef4835 not found: ID does not exist" Apr 17 16:31:19.382279 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.382260 2584 scope.go:117] "RemoveContainer" containerID="2e741793392068c640d114dc6f8244bbfeda4122ce9612fb4e6a1ad56eacb857" Apr 17 16:31:19.382485 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:31:19.382470 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e741793392068c640d114dc6f8244bbfeda4122ce9612fb4e6a1ad56eacb857\": container with ID starting with 2e741793392068c640d114dc6f8244bbfeda4122ce9612fb4e6a1ad56eacb857 not found: ID does not exist" containerID="2e741793392068c640d114dc6f8244bbfeda4122ce9612fb4e6a1ad56eacb857" Apr 17 16:31:19.382586 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.382491 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e741793392068c640d114dc6f8244bbfeda4122ce9612fb4e6a1ad56eacb857"} err="failed to get container status \"2e741793392068c640d114dc6f8244bbfeda4122ce9612fb4e6a1ad56eacb857\": rpc error: code = NotFound desc = could not find container \"2e741793392068c640d114dc6f8244bbfeda4122ce9612fb4e6a1ad56eacb857\": container with ID starting with 2e741793392068c640d114dc6f8244bbfeda4122ce9612fb4e6a1ad56eacb857 not found: ID does not exist" Apr 17 16:31:19.386429 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.386409 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607390-7w2nz"] Apr 17 16:31:19.390218 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.390197 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607390-7w2nz"] Apr 17 16:31:19.396976 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:19.396955 2584 reconciler_common.go:299] 
"Volume detached for volume \"kube-api-access-qq5l8\" (UniqueName: \"kubernetes.io/projected/9aba66a5-0d65-420a-a453-0a383587784e-kube-api-access-qq5l8\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 17 16:31:20.189966 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:31:20.189924 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aba66a5-0d65-420a-a453-0a383587784e" path="/var/lib/kubelet/pods/9aba66a5-0d65-420a-a453-0a383587784e/volumes" Apr 17 16:45:00.137083 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.136993 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29607405-2k4nz"] Apr 17 16:45:00.137583 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.137293 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9aba66a5-0d65-420a-a453-0a383587784e" containerName="cleanup" Apr 17 16:45:00.137583 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.137304 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aba66a5-0d65-420a-a453-0a383587784e" containerName="cleanup" Apr 17 16:45:00.137583 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.137324 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9aba66a5-0d65-420a-a453-0a383587784e" containerName="cleanup" Apr 17 16:45:00.137583 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.137329 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aba66a5-0d65-420a-a453-0a383587784e" containerName="cleanup" Apr 17 16:45:00.137583 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.137381 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="9aba66a5-0d65-420a-a453-0a383587784e" containerName="cleanup" Apr 17 16:45:00.137583 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.137390 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="9aba66a5-0d65-420a-a453-0a383587784e" containerName="cleanup" Apr 17 16:45:00.140156 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.140140 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" Apr 17 16:45:00.142549 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.142527 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-ps47p\"" Apr 17 16:45:00.154054 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.154023 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607405-2k4nz"] Apr 17 16:45:00.221160 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.221130 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sfvr\" (UniqueName: \"kubernetes.io/projected/1f367522-0eaa-480b-a25b-4f8daac1d877-kube-api-access-4sfvr\") pod \"maas-api-key-cleanup-29607405-2k4nz\" (UID: \"1f367522-0eaa-480b-a25b-4f8daac1d877\") " pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" Apr 17 16:45:00.321648 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.321617 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sfvr\" (UniqueName: \"kubernetes.io/projected/1f367522-0eaa-480b-a25b-4f8daac1d877-kube-api-access-4sfvr\") pod \"maas-api-key-cleanup-29607405-2k4nz\" (UID: \"1f367522-0eaa-480b-a25b-4f8daac1d877\") " pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" Apr 17 16:45:00.331791 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.331765 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sfvr\" (UniqueName: \"kubernetes.io/projected/1f367522-0eaa-480b-a25b-4f8daac1d877-kube-api-access-4sfvr\") pod \"maas-api-key-cleanup-29607405-2k4nz\" (UID: \"1f367522-0eaa-480b-a25b-4f8daac1d877\") " pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" Apr 17 16:45:00.449985 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.449926 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" Apr 17 16:45:00.569427 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.569353 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607405-2k4nz"] Apr 17 16:45:00.571570 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:45:00.571542 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f367522_0eaa_480b_a25b_4f8daac1d877.slice/crio-79614ac08a45d0085a3fdc2cf26f9756d734545d578d1a77cfae576299e9a786 WatchSource:0}: Error finding container 79614ac08a45d0085a3fdc2cf26f9756d734545d578d1a77cfae576299e9a786: Status 404 returned error can't find the container with id 79614ac08a45d0085a3fdc2cf26f9756d734545d578d1a77cfae576299e9a786 Apr 17 16:45:00.573383 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.573368 2584 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:45:00.987245 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.987214 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" event={"ID":"1f367522-0eaa-480b-a25b-4f8daac1d877","Type":"ContainerStarted","Data":"0dbb62ec1a9cb1f93e2a956ac485c839899510e6ceb675e14feb08f8295e5180"} Apr 17 16:45:00.987245 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:00.987248 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" event={"ID":"1f367522-0eaa-480b-a25b-4f8daac1d877","Type":"ContainerStarted","Data":"79614ac08a45d0085a3fdc2cf26f9756d734545d578d1a77cfae576299e9a786"} Apr 17 16:45:01.002787 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:01.002727 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" podStartSLOduration=1.002710281 podStartE2EDuration="1.002710281s" podCreationTimestamp="2026-04-17 16:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:45:01.000370539 +0000 UTC m=+1531.374283501" watchObservedRunningTime="2026-04-17 16:45:01.002710281 +0000 UTC m=+1531.376623224" Apr 17 16:45:22.054123 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:22.054089 2584 generic.go:358] "Generic (PLEG): container finished" podID="1f367522-0eaa-480b-a25b-4f8daac1d877" containerID="0dbb62ec1a9cb1f93e2a956ac485c839899510e6ceb675e14feb08f8295e5180" exitCode=6 Apr 17 16:45:22.054463 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:22.054163 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" event={"ID":"1f367522-0eaa-480b-a25b-4f8daac1d877","Type":"ContainerDied","Data":"0dbb62ec1a9cb1f93e2a956ac485c839899510e6ceb675e14feb08f8295e5180"} Apr 17 16:45:22.054563 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:22.054475 2584 scope.go:117] "RemoveContainer" containerID="0dbb62ec1a9cb1f93e2a956ac485c839899510e6ceb675e14feb08f8295e5180" Apr 17 16:45:23.062238 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:23.062197 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" event={"ID":"1f367522-0eaa-480b-a25b-4f8daac1d877","Type":"ContainerStarted","Data":"ae5695e7637340615fc9dd6d6f0537820eb067b95a0f5425d52e36e96f5c1cb7"} Apr 17 16:45:43.126577 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:43.126543 2584 generic.go:358] "Generic 
(PLEG): container finished" podID="1f367522-0eaa-480b-a25b-4f8daac1d877" containerID="ae5695e7637340615fc9dd6d6f0537820eb067b95a0f5425d52e36e96f5c1cb7" exitCode=6 Apr 17 16:45:43.126996 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:43.126618 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" event={"ID":"1f367522-0eaa-480b-a25b-4f8daac1d877","Type":"ContainerDied","Data":"ae5695e7637340615fc9dd6d6f0537820eb067b95a0f5425d52e36e96f5c1cb7"} Apr 17 16:45:43.126996 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:43.126666 2584 scope.go:117] "RemoveContainer" containerID="0dbb62ec1a9cb1f93e2a956ac485c839899510e6ceb675e14feb08f8295e5180" Apr 17 16:45:43.127087 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:43.127012 2584 scope.go:117] "RemoveContainer" containerID="ae5695e7637340615fc9dd6d6f0537820eb067b95a0f5425d52e36e96f5c1cb7" Apr 17 16:45:43.127246 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:45:43.127223 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29607405-2k4nz_opendatahub(1f367522-0eaa-480b-a25b-4f8daac1d877)\"" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" podUID="1f367522-0eaa-480b-a25b-4f8daac1d877" Apr 17 16:45:56.185412 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:56.185378 2584 scope.go:117] "RemoveContainer" containerID="ae5695e7637340615fc9dd6d6f0537820eb067b95a0f5425d52e36e96f5c1cb7" Apr 17 16:45:57.174251 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:57.174213 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" event={"ID":"1f367522-0eaa-480b-a25b-4f8daac1d877","Type":"ContainerStarted","Data":"128347ea389296ad5dfe307d5049d6c37de9b6d68922edd08f9ce61b969d4693"} Apr 17 16:45:57.211040 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:57.211014 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607405-2k4nz"] Apr 17 16:45:58.177247 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:45:58.177207 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" podUID="1f367522-0eaa-480b-a25b-4f8daac1d877" containerName="cleanup" containerID="cri-o://128347ea389296ad5dfe307d5049d6c37de9b6d68922edd08f9ce61b969d4693" gracePeriod=30 Apr 17 16:46:17.016385 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.016364 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" Apr 17 16:46:17.101182 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.101153 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sfvr\" (UniqueName: \"kubernetes.io/projected/1f367522-0eaa-480b-a25b-4f8daac1d877-kube-api-access-4sfvr\") pod \"1f367522-0eaa-480b-a25b-4f8daac1d877\" (UID: \"1f367522-0eaa-480b-a25b-4f8daac1d877\") " Apr 17 16:46:17.103210 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.103190 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f367522-0eaa-480b-a25b-4f8daac1d877-kube-api-access-4sfvr" (OuterVolumeSpecName: "kube-api-access-4sfvr") pod "1f367522-0eaa-480b-a25b-4f8daac1d877" (UID: "1f367522-0eaa-480b-a25b-4f8daac1d877"). InnerVolumeSpecName "kube-api-access-4sfvr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:46:17.201903 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.201845 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4sfvr\" (UniqueName: \"kubernetes.io/projected/1f367522-0eaa-480b-a25b-4f8daac1d877-kube-api-access-4sfvr\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 17 16:46:17.235754 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.235722 2584 generic.go:358] "Generic (PLEG): container finished" podID="1f367522-0eaa-480b-a25b-4f8daac1d877" containerID="128347ea389296ad5dfe307d5049d6c37de9b6d68922edd08f9ce61b969d4693" exitCode=6 Apr 17 16:46:17.235843 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.235765 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" Apr 17 16:46:17.235843 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.235774 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" event={"ID":"1f367522-0eaa-480b-a25b-4f8daac1d877","Type":"ContainerDied","Data":"128347ea389296ad5dfe307d5049d6c37de9b6d68922edd08f9ce61b969d4693"} Apr 17 16:46:17.235843 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.235796 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607405-2k4nz" event={"ID":"1f367522-0eaa-480b-a25b-4f8daac1d877","Type":"ContainerDied","Data":"79614ac08a45d0085a3fdc2cf26f9756d734545d578d1a77cfae576299e9a786"} Apr 17 16:46:17.235843 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.235811 2584 scope.go:117] "RemoveContainer" containerID="128347ea389296ad5dfe307d5049d6c37de9b6d68922edd08f9ce61b969d4693" Apr 17 16:46:17.244195 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.244179 2584 scope.go:117] "RemoveContainer" containerID="ae5695e7637340615fc9dd6d6f0537820eb067b95a0f5425d52e36e96f5c1cb7" Apr 17 16:46:17.251139 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.251123 2584 scope.go:117] "RemoveContainer" containerID="128347ea389296ad5dfe307d5049d6c37de9b6d68922edd08f9ce61b969d4693" Apr 17 16:46:17.251357 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:46:17.251336 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"128347ea389296ad5dfe307d5049d6c37de9b6d68922edd08f9ce61b969d4693\": container with ID starting with 128347ea389296ad5dfe307d5049d6c37de9b6d68922edd08f9ce61b969d4693 not found: ID does not exist" containerID="128347ea389296ad5dfe307d5049d6c37de9b6d68922edd08f9ce61b969d4693" Apr 17 16:46:17.251413 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.251365 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128347ea389296ad5dfe307d5049d6c37de9b6d68922edd08f9ce61b969d4693"} err="failed to get container status \"128347ea389296ad5dfe307d5049d6c37de9b6d68922edd08f9ce61b969d4693\": rpc error: code = NotFound desc = could not find container \"128347ea389296ad5dfe307d5049d6c37de9b6d68922edd08f9ce61b969d4693\": container with ID starting with 128347ea389296ad5dfe307d5049d6c37de9b6d68922edd08f9ce61b969d4693 not found: ID does not exist" Apr 17 16:46:17.251413 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.251381 2584 scope.go:117] "RemoveContainer" containerID="ae5695e7637340615fc9dd6d6f0537820eb067b95a0f5425d52e36e96f5c1cb7" Apr 17 16:46:17.251641 ip-10-0-134-77 kubenswrapper[2584]: E0417 16:46:17.251626 2584 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae5695e7637340615fc9dd6d6f0537820eb067b95a0f5425d52e36e96f5c1cb7\": container with ID starting with ae5695e7637340615fc9dd6d6f0537820eb067b95a0f5425d52e36e96f5c1cb7 not found: ID does not exist" containerID="ae5695e7637340615fc9dd6d6f0537820eb067b95a0f5425d52e36e96f5c1cb7" Apr 17 16:46:17.251685 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.251646 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae5695e7637340615fc9dd6d6f0537820eb067b95a0f5425d52e36e96f5c1cb7"} err="failed to get container status \"ae5695e7637340615fc9dd6d6f0537820eb067b95a0f5425d52e36e96f5c1cb7\": rpc error: code = NotFound desc = could not find container \"ae5695e7637340615fc9dd6d6f0537820eb067b95a0f5425d52e36e96f5c1cb7\": container with ID starting with ae5695e7637340615fc9dd6d6f0537820eb067b95a0f5425d52e36e96f5c1cb7 not found: ID does not exist" Apr 17 16:46:17.257613 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.257594 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607405-2k4nz"] Apr 17 16:46:17.261712 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:17.261691 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607405-2k4nz"] Apr 17 16:46:18.189703 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:46:18.189673 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f367522-0eaa-480b-a25b-4f8daac1d877" path="/var/lib/kubelet/pods/1f367522-0eaa-480b-a25b-4f8daac1d877/volumes" Apr 17 16:49:18.061447 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:18.061366 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-zwdl2_19b8ddd2-b0ff-41cd-8200-02fafd91a837/manager/0.log" Apr 17 16:49:18.434788 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:18.434712 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-6wdhg_32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f/manager/2.log" Apr 17 16:49:18.560293 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:18.560267 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-54994d49cf-9dnxt_3a6a74e3-cc67-487a-9daa-af05e0bf63bd/manager/0.log" Apr 17 16:49:20.208552 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:20.208519 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-6h9gv_769b3620-4b0f-499b-b643-454c071e007d/manager/0.log" Apr 17 16:49:20.314111 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:20.314083 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-bffnp_a6614bb7-4d04-4a5b-b0ff-6af39b78c2ed/manager/0.log" Apr 17 16:49:20.885268 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:20.885239 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-9pzjg_e10ece49-2669-4bfc-9c02-3836ad5abb9f/manager/0.log" Apr 17 16:49:21.319408 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:21.319387 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-xmd2v_418141b9-bb94-4b5f-a9bc-ec3afc664f02/discovery/0.log" Apr 17 16:49:21.425576 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:21.425551 2584 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_kube-auth-proxy-667bf5bb7-p2bws_93601d38-a19b-45fc-afc9-2d7991a1db51/kube-auth-proxy/0.log" Apr 17 16:49:22.192481 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:22.192451 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-r7sfp_49a52107-4f8c-4824-af45-04d8868e7e5d/storage-initializer/0.log" Apr 17 16:49:22.198787 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:22.198766 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-r7sfp_49a52107-4f8c-4824-af45-04d8868e7e5d/main/0.log" Apr 17 16:49:22.550678 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:22.550654 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5_6fd14f93-c028-4746-87f3-c785e8b01b3c/storage-initializer/0.log" Apr 17 16:49:22.559234 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:22.559214 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-x4zc5_6fd14f93-c028-4746-87f3-c785e8b01b3c/main/0.log" Apr 17 16:49:29.767285 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:29.767253 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qhfkp_bace55ce-7fc7-4b76-82f8-0f8250ee98a7/global-pull-secret-syncer/0.log" Apr 17 16:49:29.838657 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:29.838630 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-v59gz_ebb991a9-7c10-423a-8330-afafc79edd8c/konnectivity-agent/0.log" Apr 17 16:49:29.913743 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:29.913717 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-77.ec2.internal_454bf7e88903cb3fed5cc9e7d8cf5d0d/haproxy/0.log" Apr 17 16:49:34.081053 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:34.081009 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-6h9gv_769b3620-4b0f-499b-b643-454c071e007d/manager/0.log" Apr 17 16:49:34.107480 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:34.107454 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-bffnp_a6614bb7-4d04-4a5b-b0ff-6af39b78c2ed/manager/0.log" Apr 17 16:49:34.356816 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:34.356741 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-9pzjg_e10ece49-2669-4bfc-9c02-3836ad5abb9f/manager/0.log" Apr 17 16:49:36.015001 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:36.014915 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n489l_d4147357-1eb5-4032-b42c-5cc65a071498/node-exporter/0.log" Apr 17 16:49:36.029940 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:36.029910 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n489l_d4147357-1eb5-4032-b42c-5cc65a071498/kube-rbac-proxy/0.log" Apr 17 16:49:36.048440 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:36.048424 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n489l_d4147357-1eb5-4032-b42c-5cc65a071498/init-textfile/0.log" Apr 17 16:49:38.469743 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.469713 2584 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j"] Apr 17 16:49:38.470102 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.470006 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f367522-0eaa-480b-a25b-4f8daac1d877" containerName="cleanup" Apr 17 16:49:38.470102 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.470018 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f367522-0eaa-480b-a25b-4f8daac1d877" containerName="cleanup" Apr 17 16:49:38.470102 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.470027 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f367522-0eaa-480b-a25b-4f8daac1d877" containerName="cleanup" Apr 17 16:49:38.470102 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.470033 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f367522-0eaa-480b-a25b-4f8daac1d877" containerName="cleanup" Apr 17 16:49:38.470102 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.470042 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9aba66a5-0d65-420a-a453-0a383587784e" containerName="cleanup" Apr 17 16:49:38.470102 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.470047 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aba66a5-0d65-420a-a453-0a383587784e" containerName="cleanup" Apr 17 16:49:38.470102 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.470053 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f367522-0eaa-480b-a25b-4f8daac1d877" containerName="cleanup" Apr 17 16:49:38.470102 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.470058 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f367522-0eaa-480b-a25b-4f8daac1d877" containerName="cleanup" Apr 17 16:49:38.470336 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.470113 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f367522-0eaa-480b-a25b-4f8daac1d877" containerName="cleanup" Apr 17 16:49:38.470336 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.470122 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="9aba66a5-0d65-420a-a453-0a383587784e" containerName="cleanup" Apr 17 16:49:38.470336 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.470128 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f367522-0eaa-480b-a25b-4f8daac1d877" containerName="cleanup" Apr 17 16:49:38.473002 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.472985 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.475805 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.475780 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-98l5c\"/\"default-dockercfg-j7zpj\"" Apr 17 16:49:38.475805 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.475802 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-98l5c\"/\"openshift-service-ca.crt\"" Apr 17 16:49:38.476719 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.476705 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-98l5c\"/\"kube-root-ca.crt\"" Apr 17 16:49:38.480535 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.480212 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j"] Apr 17 16:49:38.514882 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.514860 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4b4a9711-4ab6-4cea-b926-b6553c2438ca-podres\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.515011 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.514901 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b4a9711-4ab6-4cea-b926-b6553c2438ca-sys\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.515011 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.514937 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4b4a9711-4ab6-4cea-b926-b6553c2438ca-proc\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.515011 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.514991 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4b4a9711-4ab6-4cea-b926-b6553c2438ca-lib-modules\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.515149 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.515070 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmkzj\" (UniqueName: \"kubernetes.io/projected/4b4a9711-4ab6-4cea-b926-b6553c2438ca-kube-api-access-fmkzj\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.616253 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.616230 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4b4a9711-4ab6-4cea-b926-b6553c2438ca-proc\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " 
pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.616368 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.616265 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4b4a9711-4ab6-4cea-b926-b6553c2438ca-lib-modules\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.616368 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.616295 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmkzj\" (UniqueName: \"kubernetes.io/projected/4b4a9711-4ab6-4cea-b926-b6553c2438ca-kube-api-access-fmkzj\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.616368 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.616346 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4b4a9711-4ab6-4cea-b926-b6553c2438ca-proc\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.616556 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.616383 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4b4a9711-4ab6-4cea-b926-b6553c2438ca-podres\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.616556 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.616411 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b4a9711-4ab6-4cea-b926-b6553c2438ca-sys\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.616556 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.616433 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4b4a9711-4ab6-4cea-b926-b6553c2438ca-lib-modules\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.616556 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.616470 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b4a9711-4ab6-4cea-b926-b6553c2438ca-sys\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.616692 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.616556 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4b4a9711-4ab6-4cea-b926-b6553c2438ca-podres\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.624409 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.624381 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-fmkzj\" (UniqueName: \"kubernetes.io/projected/4b4a9711-4ab6-4cea-b926-b6553c2438ca-kube-api-access-fmkzj\") pod \"perf-node-gather-daemonset-5dt8j\" (UID: \"4b4a9711-4ab6-4cea-b926-b6553c2438ca\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.784747 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.784718 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:38.903693 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:38.903630 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j"] Apr 17 16:49:38.906272 ip-10-0-134-77 kubenswrapper[2584]: W0417 16:49:38.906247 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4b4a9711_4ab6_4cea_b926_b6553c2438ca.slice/crio-d64296ee8e1d8ef98404350b7b575af30464674e0695e57a1940b6e83d108ea0 WatchSource:0}: Error finding container d64296ee8e1d8ef98404350b7b575af30464674e0695e57a1940b6e83d108ea0: Status 404 returned error can't find the container with id d64296ee8e1d8ef98404350b7b575af30464674e0695e57a1940b6e83d108ea0 Apr 17 16:49:39.879787 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:39.879749 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" event={"ID":"4b4a9711-4ab6-4cea-b926-b6553c2438ca","Type":"ContainerStarted","Data":"e5a0cb62cb62511ac0b1e823c81d57e05a4d3f0e64fd9133c58dc20ee5a79980"} Apr 17 16:49:39.879787 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:39.879790 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" event={"ID":"4b4a9711-4ab6-4cea-b926-b6553c2438ca","Type":"ContainerStarted","Data":"d64296ee8e1d8ef98404350b7b575af30464674e0695e57a1940b6e83d108ea0"} Apr 17 16:49:39.880213 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:39.879882 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:39.897012 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:39.896972 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" podStartSLOduration=1.89696109 podStartE2EDuration="1.89696109s" podCreationTimestamp="2026-04-17 16:49:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:49:39.89564661 +0000 UTC m=+1810.269559553" watchObservedRunningTime="2026-04-17 16:49:39.89696109 +0000 UTC m=+1810.270874033" Apr 17 16:49:40.327227 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:40.327185 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ddhcx_934467f3-270b-4b90-b4e8-331914b57c8d/dns/0.log" Apr 17 16:49:40.346933 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:40.346909 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ddhcx_934467f3-270b-4b90-b4e8-331914b57c8d/kube-rbac-proxy/0.log" Apr 17 16:49:40.488366 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:40.488339 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-976rk_b2b1e0e2-9eb2-4e4f-9027-c81e854a984c/dns-node-resolver/0.log" Apr 17 16:49:40.999900 ip-10-0-134-77 kubenswrapper[2584]: I0417 
16:49:40.999870 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cdxj4_4dd52096-bca5-4442-852e-5f41d1bb9827/node-ca/0.log" Apr 17 16:49:41.967386 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:41.967360 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-xmd2v_418141b9-bb94-4b5f-a9bc-ec3afc664f02/discovery/0.log" Apr 17 16:49:41.985796 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:41.985776 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-667bf5bb7-p2bws_93601d38-a19b-45fc-afc9-2d7991a1db51/kube-auth-proxy/0.log" Apr 17 16:49:42.568644 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:42.568591 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5sm7s_93428031-c4fe-4f1f-a088-b038195cf17e/serve-healthcheck-canary/0.log" Apr 17 16:49:43.063825 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:43.063800 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2dbmm_2f0b1797-1af9-4631-be74-97e2db42f8ec/kube-rbac-proxy/0.log" Apr 17 16:49:43.086098 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:43.086065 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2dbmm_2f0b1797-1af9-4631-be74-97e2db42f8ec/exporter/0.log" Apr 17 16:49:43.108908 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:43.108884 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2dbmm_2f0b1797-1af9-4631-be74-97e2db42f8ec/extractor/0.log" Apr 17 16:49:45.113649 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:45.113598 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-zwdl2_19b8ddd2-b0ff-41cd-8200-02fafd91a837/manager/0.log" Apr 17 16:49:45.212952 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:45.212913 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-6wdhg_32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f/manager/1.log" Apr 17 16:49:45.223294 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:45.223271 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-6wdhg_32683b8e-e0e0-4ffd-a2a6-473e0e8fb37f/manager/2.log" Apr 17 16:49:45.259106 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:45.259081 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-54994d49cf-9dnxt_3a6a74e3-cc67-487a-9daa-af05e0bf63bd/manager/0.log" Apr 17 16:49:45.893534 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:45.893511 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-5dt8j" Apr 17 16:49:52.357760 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:52.357725 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p6896_c20048a9-0ed2-477d-9b33-25dc727aeda5/kube-multus-additional-cni-plugins/0.log" Apr 17 16:49:52.383231 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:52.383204 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p6896_c20048a9-0ed2-477d-9b33-25dc727aeda5/egress-router-binary-copy/0.log" Apr 17 16:49:52.410091 ip-10-0-134-77 kubenswrapper[2584]: I0417 
16:49:52.410068 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p6896_c20048a9-0ed2-477d-9b33-25dc727aeda5/cni-plugins/0.log" Apr 17 16:49:52.440109 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:52.440082 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p6896_c20048a9-0ed2-477d-9b33-25dc727aeda5/bond-cni-plugin/0.log" Apr 17 16:49:52.459036 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:52.459000 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p6896_c20048a9-0ed2-477d-9b33-25dc727aeda5/routeoverride-cni/0.log" Apr 17 16:49:52.479963 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:52.479948 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p6896_c20048a9-0ed2-477d-9b33-25dc727aeda5/whereabouts-cni-bincopy/0.log" Apr 17 16:49:52.502398 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:52.502378 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p6896_c20048a9-0ed2-477d-9b33-25dc727aeda5/whereabouts-cni/0.log" Apr 17 16:49:52.891645 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:52.891615 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s848g_7ae29549-3750-4378-9d33-2e6bfdb368b5/kube-multus/0.log" Apr 17 16:49:52.910796 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:52.910769 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cgrms_dcbdb9dc-df9a-4c0b-850e-370061051a08/network-metrics-daemon/0.log" Apr 17 16:49:52.929614 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:52.929595 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cgrms_dcbdb9dc-df9a-4c0b-850e-370061051a08/kube-rbac-proxy/0.log" Apr 17 16:49:53.813688 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:53.813599 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qvmb_e12c6987-366e-4f26-ae6c-75cc6a5d3967/ovn-controller/0.log" Apr 17 16:49:53.838637 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:53.838611 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qvmb_e12c6987-366e-4f26-ae6c-75cc6a5d3967/ovn-acl-logging/0.log" Apr 17 16:49:53.859308 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:53.859284 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qvmb_e12c6987-366e-4f26-ae6c-75cc6a5d3967/kube-rbac-proxy-node/0.log" Apr 17 16:49:53.879921 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:53.879894 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qvmb_e12c6987-366e-4f26-ae6c-75cc6a5d3967/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 16:49:53.897177 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:53.897154 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qvmb_e12c6987-366e-4f26-ae6c-75cc6a5d3967/northd/0.log" Apr 17 16:49:53.915901 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:53.915880 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qvmb_e12c6987-366e-4f26-ae6c-75cc6a5d3967/nbdb/0.log" Apr 17 16:49:53.935687 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:53.935667 2584 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qvmb_e12c6987-366e-4f26-ae6c-75cc6a5d3967/sbdb/0.log" Apr 17 16:49:54.024371 ip-10-0-134-77 kubenswrapper[2584]: I0417 16:49:54.024345 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qvmb_e12c6987-366e-4f26-ae6c-75cc6a5d3967/ovnkube-controller/0.log"