Apr 20 14:53:30.066639 ip-10-0-142-255 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 14:53:30.490322 ip-10-0-142-255 kubenswrapper[2538]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:53:30.490322 ip-10-0-142-255 kubenswrapper[2538]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 14:53:30.490322 ip-10-0-142-255 kubenswrapper[2538]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:53:30.490322 ip-10-0-142-255 kubenswrapper[2538]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 14:53:30.490322 ip-10-0-142-255 kubenswrapper[2538]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:53:30.493361 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.493197    2538 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 14:53:30.497121 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497105    2538 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:53:30.497121 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497121    2538 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497125    2538 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497129    2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497132    2538 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497135    2538 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497139    2538 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497142    2538 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497146    2538 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497150    2538 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497160    2538 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497163    2538 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497166    2538 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497169    2538 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497172    2538 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497175    2538 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497178    2538 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497181    2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497184    2538 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497186    2538 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497189    2538 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:53:30.497186 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497192    2538 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497195    2538 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497198    2538 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497201    2538 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497203    2538 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497206    2538 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497209    2538 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497212    2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497214    2538 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497217    2538 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497219    2538 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497221    2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497224    2538 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497226    2538 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497236    2538 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497240    2538 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497243    2538 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497245    2538 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497248    2538 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497251    2538 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:53:30.497674 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497255    2538 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497258    2538 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497260    2538 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497263    2538 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497265    2538 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497273    2538 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497276    2538 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497278    2538 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497281    2538 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497284    2538 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497286    2538 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497289    2538 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497291    2538 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497294    2538 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497298    2538 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497300    2538 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497303    2538 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497306    2538 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497308    2538 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497311    2538 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:53:30.498196 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497314    2538 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497316    2538 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497321    2538 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497325    2538 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497328    2538 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497330    2538 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497333    2538 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497336    2538 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497351    2538 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497354    2538 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497356    2538 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497359    2538 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497361    2538 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497365    2538 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497367    2538 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497370    2538 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497372    2538 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497375    2538 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497377    2538 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:53:30.498691 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497381    2538 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497383    2538 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497386    2538 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497388    2538 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497391    2538 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.497393    2538 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498325    2538 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498331    2538 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498336    2538 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498351    2538 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498355    2538 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498358    2538 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498361    2538 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498364    2538 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498367    2538 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498370    2538 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498373    2538 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498375    2538 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498378    2538 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498380    2538 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:53:30.499154 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498383    2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498386    2538 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498388    2538 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498391    2538 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498393    2538 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498396    2538 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498399    2538 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498401    2538 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498404    2538 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498406    2538 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498409    2538 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498411    2538 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498414    2538 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498417    2538 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498420    2538 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498422    2538 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498425    2538 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498427    2538 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498430    2538 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498432    2538 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:53:30.499656 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498435    2538 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498437    2538 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498440    2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498442    2538 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498445    2538 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498448    2538 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498450    2538 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498453    2538 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498456    2538 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498458    2538 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498461    2538 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498463    2538 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498466    2538 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498469    2538 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498471    2538 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498474    2538 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498476    2538 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498479    2538 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498482    2538 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:53:30.500150 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498485    2538 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498487    2538 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498490    2538 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498492    2538 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498495    2538 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498498    2538 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498501    2538 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498503    2538 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498506    2538 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498508    2538 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498510    2538 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498513    2538 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498515    2538 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498518    2538 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498520    2538 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498523    2538 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498526    2538 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498528    2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498531    2538 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498533    2538 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:53:30.500633 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498536    2538 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498539    2538 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498541    2538 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498544    2538 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498548    2538 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498551    2538 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498553    2538 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498556    2538 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498558    2538 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498560    2538 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498563    2538 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498566    2538 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.498570 2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498644 2538 flags.go:64] FLAG: --address="0.0.0.0" Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498651 2538 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498658 2538 flags.go:64] FLAG: --anonymous-auth="true" Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498662 2538 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498673 2538 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498677 2538 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498682 2538 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498687 2538 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 20 14:53:30.501137 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498690 2538 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498693 2538 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498697 2538 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498700 2538 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498704 2538 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498707 2538 flags.go:64] FLAG: --cgroup-root="" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498710 2538 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498713 2538 flags.go:64] FLAG: --client-ca-file="" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498715 2538 flags.go:64] FLAG: --cloud-config="" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498718 2538 flags.go:64] FLAG: --cloud-provider="external" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498722 2538 flags.go:64] FLAG: --cluster-dns="[]" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498726 2538 flags.go:64] FLAG: --cluster-domain="" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498729 2538 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498732 2538 flags.go:64] FLAG: --config-dir="" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498735 2538 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498738 2538 flags.go:64] FLAG: --container-log-max-files="5" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: 
I0420 14:53:30.498742 2538 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498745 2538 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498748 2538 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498752 2538 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498755 2538 flags.go:64] FLAG: --contention-profiling="false" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498758 2538 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498761 2538 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498764 2538 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498767 2538 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 20 14:53:30.501675 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498771 2538 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498774 2538 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498777 2538 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498780 2538 flags.go:64] FLAG: --enable-load-reader="false" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498783 2538 flags.go:64] FLAG: --enable-server="true" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498786 2538 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498790 2538 flags.go:64] FLAG: --event-burst="100" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498793 2538 flags.go:64] FLAG: --event-qps="50" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498799 2538 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498802 2538 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498806 2538 flags.go:64] FLAG: --eviction-hard="" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498809 2538 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498812 2538 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498815 2538 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498818 2538 flags.go:64] FLAG: --eviction-soft="" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498821 2538 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498824 2538 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 
14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498827 2538 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498830 2538 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498833 2538 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498836 2538 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498838 2538 flags.go:64] FLAG: --feature-gates="" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498842 2538 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498845 2538 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498848 2538 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 14:53:30.502303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498851 2538 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498854 2538 flags.go:64] FLAG: --healthz-port="10248" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498857 2538 flags.go:64] FLAG: --help="false" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498860 2538 flags.go:64] FLAG: --hostname-override="ip-10-0-142-255.ec2.internal" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498863 2538 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498866 2538 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498869 2538 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498872 2538 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498876 2538 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498879 2538 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498881 2538 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498885 2538 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498887 2538 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498890 2538 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498893 2538 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498897 2538 flags.go:64] FLAG: --kube-reserved="" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498900 2538 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 
14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498903 2538 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498906 2538 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498922 2538 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498926 2538 flags.go:64] FLAG: --lock-file="" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498930 2538 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498933 2538 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498936 2538 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 14:53:30.502937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498942 2538 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498945 2538 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498948 2538 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498951 2538 flags.go:64] FLAG: --logging-format="text" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498953 2538 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498956 2538 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498959 2538 flags.go:64] FLAG: --manifest-url="" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498962 2538 flags.go:64] FLAG: --manifest-url-header="" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498966 2538 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498969 2538 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498974 2538 flags.go:64] FLAG: --max-pods="110" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498977 2538 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498980 2538 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498983 2538 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498986 2538 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498989 2538 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498991 2538 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.498995 2538 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499002 2538 
flags.go:64] FLAG: --node-status-max-images="50" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499005 2538 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499008 2538 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499011 2538 flags.go:64] FLAG: --pod-cidr="" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499014 2538 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 14:53:30.503559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499019 2538 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499023 2538 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499026 2538 flags.go:64] FLAG: --pods-per-core="0" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499029 2538 flags.go:64] FLAG: --port="10250" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499032 2538 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499034 2538 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0fb561ee0a60ae5e2" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499038 2538 flags.go:64] FLAG: --qos-reserved="" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499041 2538 flags.go:64] FLAG: --read-only-port="10255" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499044 2538 flags.go:64] FLAG: --register-node="true" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499046 2538 flags.go:64] FLAG: --register-schedulable="true" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499049 2538 flags.go:64] FLAG: --register-with-taints="" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499053 2538 flags.go:64] FLAG: --registry-burst="10" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499056 2538 flags.go:64] FLAG: --registry-qps="5" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499058 2538 flags.go:64] FLAG: --reserved-cpus="" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499061 2538 flags.go:64] FLAG: --reserved-memory="" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499065 2538 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499068 2538 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499071 2538 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499073 2538 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499076 2538 flags.go:64] FLAG: --runonce="false" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499079 2538 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 
14:53:30.499082 2538 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499085 2538 flags.go:64] FLAG: --seccomp-default="false" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499088 2538 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499091 2538 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499094 2538 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 14:53:30.504120 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499096 2538 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499100 2538 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499103 2538 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499106 2538 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499108 2538 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499111 2538 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499114 2538 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499118 2538 flags.go:64] FLAG: --system-cgroups="" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499121 2538 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499126 2538 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499129 2538 flags.go:64] FLAG: --tls-cert-file="" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499132 2538 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499136 2538 flags.go:64] FLAG: --tls-min-version="" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499139 2538 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499142 2538 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499144 2538 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499147 2538 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499152 2538 flags.go:64] FLAG: --v="2" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499157 2538 flags.go:64] FLAG: --version="false" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499161 2538 flags.go:64] FLAG: --vmodule="" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.499170 2538 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 14:53:30.504767 ip-10-0-142-255 
kubenswrapper[2538]: I0420 14:53:30.499173 2538 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499258 2538 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499262 2538 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 14:53:30.504767 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499266 2538 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499269 2538 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499271 2538 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499274 2538 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499276 2538 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499279 2538 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499281 2538 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499284 2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499287 2538 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499289 2538 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499292 2538 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499295 2538 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499298 2538 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499300 2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499303 2538 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499307 2538 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499309 2538 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499312 2538 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499314 2538 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499317 2538 feature_gate.go:328] unrecognized 
feature gate: NewOLMPreflightPermissionChecks Apr 20 14:53:30.505455 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499319 2538 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499322 2538 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499324 2538 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499327 2538 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499329 2538 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499333 2538 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499335 2538 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499338 2538 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499353 2538 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499356 2538 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499359 2538 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499362 2538 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499364 2538 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499367 2538 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499369 2538 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499372 2538 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499374 2538 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499377 2538 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499379 2538 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 14:53:30.505971 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499382 2538 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499385 2538 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499387 2538 feature_gate.go:328] unrecognized feature gate: 
ExternalSnapshotMetadata Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499390 2538 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499393 2538 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499395 2538 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499398 2538 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499401 2538 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499405 2538 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499407 2538 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499410 2538 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499412 2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499415 2538 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499417 2538 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499420 2538 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499422 2538 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499425 2538 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499428 2538 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499432 2538 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499434 2538 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 14:53:30.506468 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499437 2538 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499440 2538 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499444 2538 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
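The feature_gate.go:328 warnings above (and continuing below) come from the kubelet's feature-gate registry being handed names it never registered: gates such as GatewayAPI or ManagedBootImages appear to be OpenShift cluster-level features rather than gates compiled into the upstream kubelet binary, so every lookup misses. A minimal sketch of the registry contract, using the upstream k8s.io/component-base/featuregate package; the gate specs here are illustrative rather than the real defaults, and note that stock SetFromMap returns a hard error for unknown keys where this patched kubelet evidently downgrades it to a warning:

    package main

    import (
        "fmt"

        "k8s.io/component-base/featuregate"
    )

    func main() {
        // A mutable registry, as the kubelet builds at startup.
        gate := featuregate.NewFeatureGate()

        // Only registered names are known to the binary. "ImageVolume" is one
        // of the upstream gates visible in the summary map below; the spec
        // values here are illustrative, not the real defaults.
        if err := gate.Add(map[featuregate.Feature]featuregate.FeatureSpec{
            "ImageVolume": {Default: false, PreRelease: featuregate.Beta},
        }); err != nil {
            panic(err)
        }

        // Known key: accepted and recorded.
        fmt.Println(gate.SetFromMap(map[string]bool{"ImageVolume": true})) // <nil>
        fmt.Println(gate.Enabled("ImageVolume"))                           // true

        // Unknown key: the stock library rejects it outright.
        err := gate.SetFromMap(map[string]bool{"GatewayAPI": true})
        fmt.Println(err) // unrecognized feature gate: GatewayAPI
    }

Only keys known to the binary survive into the effective set, which is why the feature_gate.go:384 summary below lists seventeen upstream gates out of the far longer list supplied by the configuration.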
Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499448 2538 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499451 2538 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499454 2538 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499457 2538 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499460 2538 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499462 2538 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499465 2538 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499467 2538 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499470 2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499473 2538 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499476 2538 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499480 2538 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499483 2538 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499486 2538 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499490 2538 feature_gate.go:328] unrecognized feature gate: Example Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499492 2538 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 14:53:30.507200 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499495 2538 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 14:53:30.507744 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499499 2538 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 14:53:30.507744 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499501 2538 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 14:53:30.507744 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499504 2538 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 14:53:30.507744 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499507 2538 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 14:53:30.507744 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.499509 2538 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 14:53:30.507744 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.500207 2538 
feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 14:53:30.507987 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.507971 2538 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 14:53:30.508020 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.507989 2538 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 14:53:30.508052 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508038 2538 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 14:53:30.508052 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508042 2538 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 14:53:30.508052 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508046 2538 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 14:53:30.508052 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508049 2538 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 14:53:30.508052 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508052 2538 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508055 2538 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508058 2538 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508061 2538 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508064 2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508067 2538 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508070 2538 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508073 2538 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508076 2538 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508079 2538 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508082 2538 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508085 2538 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508088 2538 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 14:53:30.508172 
ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508091 2538 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508094 2538 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508097 2538 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508100 2538 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508102 2538 feature_gate.go:328] unrecognized feature gate: Example Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508105 2538 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508108 2538 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 14:53:30.508172 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508110 2538 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508113 2538 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508116 2538 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508118 2538 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508121 2538 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508123 2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508126 2538 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508128 2538 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508131 2538 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508133 2538 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508136 2538 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508138 2538 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508141 2538 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508143 2538 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508147 2538 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508149 2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere 
Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508152 2538 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508155 2538 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508158 2538 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508161 2538 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 14:53:30.508757 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508163 2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508167 2538 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508172 2538 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508175 2538 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508178 2538 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508181 2538 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508184 2538 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508187 2538 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508189 2538 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508193 2538 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
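Two warnings in the run above are different in kind from the unrecognized ones: ServiceAccountTokenNodeBinding (feature_gate.go:351) is a known gate that has graduated to GA, and KMSv1 (feature_gate.go:349) is known but deprecated. Explicitly setting either still succeeds, and for a GA gate only as long as the value matches its locked default; contradicting the lock is a hard error. A sketch under the same illustrative-spec assumption as above:

    package main

    import (
        "fmt"

        "k8s.io/component-base/featuregate"
    )

    func main() {
        gate := featuregate.NewFeatureGate()

        // Specs are illustrative; LockToDefault is what upstream sets once a
        // gate graduates and can no longer be turned off.
        _ = gate.Add(map[featuregate.Feature]featuregate.FeatureSpec{
            "ServiceAccountTokenNodeBinding": {Default: true, PreRelease: featuregate.GA, LockToDefault: true},
            "KMSv1":                          {Default: false, PreRelease: featuregate.Deprecated},
        })

        // Matches the locked default: accepted, with the warning seen above.
        fmt.Println(gate.SetFromMap(map[string]bool{"ServiceAccountTokenNodeBinding": true})) // <nil>

        // Contradicts the locked default: hard error, startup would fail.
        fmt.Println(gate.SetFromMap(map[string]bool{"ServiceAccountTokenNodeBinding": false}))
        // cannot set feature gate ServiceAccountTokenNodeBinding to false, feature is locked to true

        // Deprecated but not locked: still settable, just warned about.
        fmt.Println(gate.SetFromMap(map[string]bool{"KMSv1": true})) // <nil>
        fmt.Println(gate.Enabled("KMSv1"))                           // true
    }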
Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508197 2538 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508200 2538 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508202 2538 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508205 2538 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508207 2538 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508210 2538 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508212 2538 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508215 2538 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508217 2538 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 14:53:30.509286 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508220 2538 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508223 2538 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508225 2538 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508228 2538 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508230 2538 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508233 2538 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508235 2538 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508238 2538 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508241 2538 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508243 2538 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508246 2538 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508249 2538 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508251 2538 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: 
W0420 14:53:30.508254 2538 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508256 2538 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508259 2538 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508261 2538 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508264 2538 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508266 2538 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508269 2538 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 14:53:30.509785 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508271 2538 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508274 2538 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508276 2538 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.508282 2538 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508393 2538 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508399 2538 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508402 2538 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508405 2538 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508408 2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508411 2538 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508414 2538 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508416 2538 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508419 2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 
14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508421 2538 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508424 2538 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 14:53:30.510275 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508426 2538 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508430 2538 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508434 2538 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508437 2538 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508440 2538 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508442 2538 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508445 2538 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508448 2538 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508451 2538 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508454 2538 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508456 2538 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508459 2538 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508462 2538 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508464 2538 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508467 2538 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508469 2538 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508472 2538 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508475 2538 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508477 2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508479 2538 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 14:53:30.510663 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508482 2538 
feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508484 2538 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508487 2538 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508489 2538 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508492 2538 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508495 2538 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508497 2538 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508501 2538 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508504 2538 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508507 2538 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508510 2538 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508512 2538 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508515 2538 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508517 2538 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508520 2538 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508522 2538 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508525 2538 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508528 2538 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508531 2538 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508533 2538 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 14:53:30.511157 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508536 2538 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508538 2538 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508540 2538 feature_gate.go:328] unrecognized 
feature gate: AWSDedicatedHosts Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508543 2538 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508550 2538 feature_gate.go:328] unrecognized feature gate: Example Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508553 2538 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508555 2538 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508558 2538 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508560 2538 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508562 2538 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508565 2538 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508568 2538 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508570 2538 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508573 2538 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508575 2538 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508578 2538 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508580 2538 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508582 2538 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508585 2538 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508587 2538 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 14:53:30.511668 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508590 2538 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 14:53:30.512169 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508593 2538 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 14:53:30.512169 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508596 2538 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 14:53:30.512169 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508598 2538 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 14:53:30.512169 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508601 2538 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 14:53:30.512169 ip-10-0-142-255 
kubenswrapper[2538]: W0420 14:53:30.508603 2538 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 14:53:30.512169 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508606 2538 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 14:53:30.512169 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508609 2538 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 14:53:30.512169 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508611 2538 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 14:53:30.512169 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508614 2538 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 14:53:30.512169 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508617 2538 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 14:53:30.512169 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508619 2538 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 14:53:30.512169 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508622 2538 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 14:53:30.512169 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508624 2538 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 14:53:30.512169 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:30.508627 2538 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 14:53:30.512169 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.508632 2538 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 14:53:30.512710 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.509310 2538 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 14:53:30.512710 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.511943 2538 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 14:53:30.512825 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.512761 2538 server.go:1019] "Starting client certificate rotation" Apr 20 14:53:30.512877 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.512858 2538 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 14:53:30.512910 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.512897 2538 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 14:53:30.534750 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.534732 2538 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 14:53:30.538256 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.538240 2538 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 14:53:30.553491 ip-10-0-142-255 kubenswrapper[2538]: I0420 
14:53:30.553473 2538 log.go:25] "Validated CRI v1 runtime API" Apr 20 14:53:30.559303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.559285 2538 log.go:25] "Validated CRI v1 image API" Apr 20 14:53:30.560563 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.560549 2538 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 14:53:30.563868 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.563836 2538 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 addef07d-aca9-4057-857e-07081976db74:/dev/nvme0n1p3 b6fdfaae-3a2b-433f-9ada-41220d8b8562:/dev/nvme0n1p4] Apr 20 14:53:30.563965 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.563864 2538 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 14:53:30.566232 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.566212 2538 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 14:53:30.570436 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.570309 2538 manager.go:217] Machine: {Timestamp:2026-04-20 14:53:30.56869741 +0000 UTC m=+0.387322944 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3116537 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2567f156d01287bffcb5440ff13029 SystemUUID:ec2567f1-56d0-1287-bffc-b5440ff13029 BootID:9ce34fc1-e43a-40b7-a3d7-dfe29ea69bcf Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a2:32:a9:88:af Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a2:32:a9:88:af Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ae:73:d2:64:d6:c3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 
Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 14:53:30.570436 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.570429 2538 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 20 14:53:30.570557 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.570511 2538 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 14:53:30.574532 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.574506 2538 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 14:53:30.574666 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.574535 2538 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-255.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 14:53:30.574715 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.574675 2538 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 14:53:30.574715 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.574683 2538 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 14:53:30.574715 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.574696 2538 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 14:53:30.575260 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.575250 2538 
server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 14:53:30.575919 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.575910 2538 state_mem.go:36] "Initialized new in-memory state store" Apr 20 14:53:30.576026 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.576017 2538 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 14:53:30.579085 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.579074 2538 kubelet.go:491] "Attempting to sync node with API server" Apr 20 14:53:30.579139 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.579090 2538 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 14:53:30.579139 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.579106 2538 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 14:53:30.579139 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.579117 2538 kubelet.go:397] "Adding apiserver pod source" Apr 20 14:53:30.579139 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.579125 2538 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 14:53:30.580529 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.580517 2538 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 14:53:30.580571 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.580544 2538 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 14:53:30.583273 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.583253 2538 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 14:53:30.584685 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.584671 2538 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 14:53:30.585878 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.585861 2538 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-546mc" Apr 20 14:53:30.588186 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.588174 2538 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 14:53:30.588238 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.588191 2538 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 14:53:30.588238 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.588198 2538 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 14:53:30.588238 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.588205 2538 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 14:53:30.588238 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.588211 2538 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 14:53:30.588238 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.588217 2538 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 14:53:30.588238 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.588223 2538 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 14:53:30.588238 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.588228 2538 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 
14:53:30.588238 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.588234 2538 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 14:53:30.588238 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.588240 2538 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 14:53:30.588492 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.588254 2538 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 14:53:30.588492 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.588264 2538 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 14:53:30.588954 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.588945 2538 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 14:53:30.588984 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.588954 2538 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 14:53:30.592502 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.592484 2538 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-255.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 14:53:30.592583 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.592504 2538 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 14:53:30.592723 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.592710 2538 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 14:53:30.592773 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.592746 2538 server.go:1295] "Started kubelet" Apr 20 14:53:30.592773 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.592753 2538 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-255.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 14:53:30.592871 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.592827 2538 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 14:53:30.592937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.592890 2538 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 14:53:30.592981 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.592968 2538 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 14:53:30.593614 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.593586 2538 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-546mc" Apr 20 14:53:30.593745 ip-10-0-142-255 systemd[1]: Started Kubernetes Kubelet. 
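By this point the kubelet has validated the CRI v1 runtime and image APIs, adopted the systemd cgroup driver advertised by CRI-O, built its container manager from the NodeConfig dumped above (systemReserved of 500m CPU, 1Gi memory, 1Gi ephemeral-storage; the default hard-eviction thresholds; a pod PID limit of 4096), loaded its volume plugins, and started serving on port 10250. A sketch of the same reservations expressed declaratively with the published v1beta1 config types; the values are copied from the NodeConfig entry above, but the exact config file this node uses is an assumption:

    package main

    import (
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        kubeletv1beta1 "k8s.io/kubelet/config/v1beta1"
        "sigs.k8s.io/yaml"
    )

    func main() {
        pids := int64(4096)
        cfg := kubeletv1beta1.KubeletConfiguration{
            TypeMeta: metav1.TypeMeta{
                Kind:       "KubeletConfiguration",
                APIVersion: "kubelet.config.k8s.io/v1beta1",
            },
            // Values mirrored from the container manager NodeConfig above.
            CgroupDriver: "systemd",
            SystemReserved: map[string]string{
                "cpu":               "500m",
                "memory":            "1Gi",
                "ephemeral-storage": "1Gi",
            },
            EvictionHard: map[string]string{
                "memory.available":   "100Mi",
                "nodefs.available":   "10%",
                "nodefs.inodesFree":  "5%",
                "imagefs.available":  "15%",
                "imagefs.inodesFree": "5%",
            },
            PodPidsLimit: &pids,
        }
        out, err := yaml.Marshal(cfg)
        if err != nil {
            panic(err)
        }
        fmt.Print(string(out))
    }

Marshaling with sigs.k8s.io/yaml yields a file that can be handed to the kubelet via --config, which is the mechanism the flag-deprecation machinery steers toward.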
Apr 20 14:53:30.594198 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.593931 2538 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 14:53:30.594935 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.594765 2538 server.go:317] "Adding debug handlers to kubelet server" Apr 20 14:53:30.601176 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.601155 2538 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 14:53:30.601273 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.601224 2538 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 14:53:30.601754 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.601740 2538 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 14:53:30.602366 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.602337 2538 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 14:53:30.602366 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.602368 2538 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 14:53:30.602509 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.602337 2538 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 14:53:30.602509 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.602448 2538 reconstruct.go:97] "Volume reconstruction finished" Apr 20 14:53:30.602509 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.602457 2538 reconciler.go:26] "Reconciler: start to sync state" Apr 20 14:53:30.602509 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.602447 2538 factory.go:55] Registering systemd factory Apr 20 14:53:30.602662 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.602561 2538 factory.go:223] Registration of the systemd container factory successfully Apr 20 14:53:30.602662 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.602568 2538 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 20 14:53:30.602807 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.602790 2538 factory.go:153] Registering CRI-O factory Apr 20 14:53:30.602807 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.602805 2538 factory.go:223] Registration of the crio container factory successfully Apr 20 14:53:30.602960 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.602864 2538 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 14:53:30.602960 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.602883 2538 factory.go:103] Registering Raw factory Apr 20 14:53:30.602960 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.602894 2538 manager.go:1196] Started watching for new ooms in manager Apr 20 14:53:30.603303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.603290 2538 manager.go:319] Starting recovery of all containers Apr 20 14:53:30.608066 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.608042 2538 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:53:30.615164 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.615146 2538 manager.go:324] Recovery completed Apr 20 
14:53:30.615164 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.615173 2538 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-255.ec2.internal\" not found" node="ip-10-0-142-255.ec2.internal" Apr 20 14:53:30.619753 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.619736 2538 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:53:30.622943 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.622926 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:53:30.623019 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.622955 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:53:30.623019 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.622965 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:53:30.623526 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.623513 2538 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 14:53:30.623526 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.623524 2538 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 14:53:30.623632 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.623539 2538 state_mem.go:36] "Initialized new in-memory state store" Apr 20 14:53:30.625551 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.625539 2538 policy_none.go:49] "None policy: Start" Apr 20 14:53:30.625603 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.625554 2538 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 14:53:30.625603 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.625564 2538 state_mem.go:35] "Initializing new in-memory state store" Apr 20 14:53:30.663890 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.663874 2538 manager.go:341] "Starting Device Plugin manager" Apr 20 14:53:30.674679 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.663918 2538 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 14:53:30.674679 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.663931 2538 server.go:85] "Starting device plugin registration server" Apr 20 14:53:30.674679 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.664201 2538 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 14:53:30.674679 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.664216 2538 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 14:53:30.674679 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.664290 2538 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 14:53:30.674679 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.664387 2538 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 14:53:30.674679 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.664397 2538 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 14:53:30.674679 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.664913 2538 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 20 14:53:30.674679 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.664957 2538 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-255.ec2.internal\" not found" Apr 20 14:53:30.699430 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.699406 2538 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 14:53:30.700693 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.700673 2538 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 14:53:30.700693 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.700696 2538 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 14:53:30.700821 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.700713 2538 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 14:53:30.700821 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.700720 2538 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 14:53:30.700821 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.700787 2538 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 14:53:30.704030 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.704012 2538 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:53:30.765078 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.765014 2538 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:53:30.765890 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.765873 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:53:30.765980 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.765910 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:53:30.765980 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.765927 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:53:30.765980 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.765959 2538 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-255.ec2.internal" Apr 20 14:53:30.776517 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.776502 2538 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-255.ec2.internal" Apr 20 14:53:30.776588 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.776522 2538 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-255.ec2.internal\": node \"ip-10-0-142-255.ec2.internal\" not found" Apr 20 14:53:30.792769 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.792750 2538 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 20 14:53:30.801465 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.801437 2538 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal"] Apr 20 14:53:30.801531 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.801509 2538 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:53:30.802284 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.802269 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:53:30.802375 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.802293 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:53:30.802375 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.802302 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:53:30.804481 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.804468 2538 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:53:30.804620 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.804607 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 20 14:53:30.804659 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.804634 2538 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:53:30.805157 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.805129 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:53:30.805157 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.805140 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:53:30.805157 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.805159 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:53:30.805335 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.805170 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:53:30.805335 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.805159 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:53:30.805335 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.805256 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:53:30.807370 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.807358 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" Apr 20 14:53:30.807410 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.807382 2538 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:53:30.807993 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.807978 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:53:30.808068 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.808006 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:53:30.808068 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.808020 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:53:30.835843 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.835825 2538 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-255.ec2.internal\" not found" node="ip-10-0-142-255.ec2.internal" Apr 20 14:53:30.840080 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.840065 2538 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-255.ec2.internal\" not found" node="ip-10-0-142-255.ec2.internal" Apr 20 14:53:30.893766 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.893747 2538 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 20 14:53:30.904035 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.904017 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e121b59458e83a52d173db12d00639e1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal\" (UID: \"e121b59458e83a52d173db12d00639e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 20 14:53:30.904092 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.904046 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e121b59458e83a52d173db12d00639e1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal\" (UID: \"e121b59458e83a52d173db12d00639e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 20 14:53:30.904092 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:30.904065 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e688fec9147a531ae0f3ba981a4ec304-config\") pod \"kube-apiserver-proxy-ip-10-0-142-255.ec2.internal\" (UID: \"e688fec9147a531ae0f3ba981a4ec304\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" Apr 20 14:53:30.994605 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:30.994574 2538 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 20 14:53:31.005172 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.005149 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/e121b59458e83a52d173db12d00639e1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal\" (UID: \"e121b59458e83a52d173db12d00639e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 20 14:53:31.005242 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.005177 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e121b59458e83a52d173db12d00639e1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal\" (UID: \"e121b59458e83a52d173db12d00639e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 20 14:53:31.005242 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.005195 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e688fec9147a531ae0f3ba981a4ec304-config\") pod \"kube-apiserver-proxy-ip-10-0-142-255.ec2.internal\" (UID: \"e688fec9147a531ae0f3ba981a4ec304\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" Apr 20 14:53:31.005309 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.005246 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e688fec9147a531ae0f3ba981a4ec304-config\") pod \"kube-apiserver-proxy-ip-10-0-142-255.ec2.internal\" (UID: \"e688fec9147a531ae0f3ba981a4ec304\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" Apr 20 14:53:31.005309 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.005258 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e121b59458e83a52d173db12d00639e1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal\" (UID: \"e121b59458e83a52d173db12d00639e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 20 14:53:31.005309 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.005266 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e121b59458e83a52d173db12d00639e1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal\" (UID: \"e121b59458e83a52d173db12d00639e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 20 14:53:31.095651 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:31.095619 2538 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 20 14:53:31.138107 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.138081 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 20 14:53:31.142570 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.142551 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" Apr 20 14:53:31.196152 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:31.196120 2538 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 20 14:53:31.296662 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:31.296639 2538 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 20 14:53:31.397203 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:31.397125 2538 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 20 14:53:31.497695 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:31.497661 2538 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 20 14:53:31.514084 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.514058 2538 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 14:53:31.514229 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.514211 2538 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 14:53:31.514281 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.514235 2538 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 14:53:31.596442 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.596223 2538 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 14:48:30 +0000 UTC" deadline="2027-10-16 08:44:58.255916509 +0000 UTC" Apr 20 14:53:31.596442 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.596440 2538 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13049h51m26.659480678s" Apr 20 14:53:31.598414 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:31.598394 2538 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 20 14:53:31.601888 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.601859 2538 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 14:53:31.626968 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.626947 2538 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 14:53:31.653436 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.653375 2538 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zkf2v" Apr 20 14:53:31.663319 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.663302 2538 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zkf2v" Apr 20 14:53:31.695105 ip-10-0-142-255 kubenswrapper[2538]: W0420 
14:53:31.695069 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode121b59458e83a52d173db12d00639e1.slice/crio-0e63679cacceb0d1386deaa79bc02b47a5b4bdc9ec56ca76f40c09b33b75d2a8 WatchSource:0}: Error finding container 0e63679cacceb0d1386deaa79bc02b47a5b4bdc9ec56ca76f40c09b33b75d2a8: Status 404 returned error can't find the container with id 0e63679cacceb0d1386deaa79bc02b47a5b4bdc9ec56ca76f40c09b33b75d2a8 Apr 20 14:53:31.695578 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:31.695554 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode688fec9147a531ae0f3ba981a4ec304.slice/crio-8fa3d1c7a1b6189dbae8fb3ca675244951675039fbaf0ecd7a347ac7207ad9ad WatchSource:0}: Error finding container 8fa3d1c7a1b6189dbae8fb3ca675244951675039fbaf0ecd7a347ac7207ad9ad: Status 404 returned error can't find the container with id 8fa3d1c7a1b6189dbae8fb3ca675244951675039fbaf0ecd7a347ac7207ad9ad Apr 20 14:53:31.698507 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:31.698489 2538 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 20 14:53:31.699902 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.699890 2538 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:53:31.703038 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.702999 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" event={"ID":"e121b59458e83a52d173db12d00639e1","Type":"ContainerStarted","Data":"0e63679cacceb0d1386deaa79bc02b47a5b4bdc9ec56ca76f40c09b33b75d2a8"} Apr 20 14:53:31.704045 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.704023 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" event={"ID":"e688fec9147a531ae0f3ba981a4ec304","Type":"ContainerStarted","Data":"8fa3d1c7a1b6189dbae8fb3ca675244951675039fbaf0ecd7a347ac7207ad9ad"} Apr 20 14:53:31.715535 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.715515 2538 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:53:31.801906 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.801883 2538 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" Apr 20 14:53:31.803545 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.803529 2538 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:53:31.817827 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.817806 2538 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:53:31.818594 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.818583 2538 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 20 14:53:31.830682 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:31.830668 2538 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:53:32.579786 
ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.579756 2538 apiserver.go:52] "Watching apiserver" Apr 20 14:53:32.588511 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.588489 2538 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 14:53:32.588867 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.588845 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-777tg","kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal","openshift-cluster-node-tuning-operator/tuned-fz52j","openshift-image-registry/node-ca-xt2lb","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal","openshift-multus/multus-additional-cni-plugins-fsr4z","openshift-network-diagnostics/network-check-target-rdpq8","openshift-network-operator/iptables-alerter-dwkwl","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82","openshift-multus/multus-mw2xg","openshift-multus/network-metrics-daemon-fdmrj","openshift-ovn-kubernetes/ovnkube-node-q22pz"] Apr 20 14:53:32.591279 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.591252 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-777tg" Apr 20 14:53:32.593507 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.593470 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.594011 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.593990 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-5267v\"" Apr 20 14:53:32.594102 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.593994 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 14:53:32.594102 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.593999 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 14:53:32.595693 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.595667 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xt2lb" Apr 20 14:53:32.596147 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.595990 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 14:53:32.596147 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.596049 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gc8mh\"" Apr 20 14:53:32.596837 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.596814 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:53:32.597934 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.597905 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 14:53:32.598320 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.598300 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.600220 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.599679 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 14:53:32.600220 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.599711 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 14:53:32.600220 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.599939 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6rxz6\"" Apr 20 14:53:32.601770 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.601727 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 14:53:32.602243 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.602039 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:32.602243 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.602081 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 14:53:32.602243 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.602109 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 14:53:32.602243 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:32.602118 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:53:32.602527 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.602331 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vk54s\"" Apr 20 14:53:32.602527 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.602482 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 14:53:32.602640 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.602622 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 14:53:32.604548 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.604529 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-dwkwl" Apr 20 14:53:32.607630 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.607608 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 14:53:32.607702 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.607675 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:53:32.607793 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.607777 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 14:53:32.608032 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.608015 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7pr4z\"" Apr 20 14:53:32.609080 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.609059 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.609159 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.609115 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.611467 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.611210 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:32.611467 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:32.611273 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:32.611855 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.611710 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 14:53:32.611855 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.611768 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gwml9\"" Apr 20 14:53:32.612138 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612122 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 14:53:32.612138 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612125 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h95c4\"" Apr 20 14:53:32.612263 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612143 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 14:53:32.612263 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612151 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 14:53:32.612554 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612534 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-sys\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.612655 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612571 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.612655 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612602 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1252f9bc-f9c8-4a62-8bbf-b5e145f0e656-serviceca\") pod \"node-ca-xt2lb\" (UID: \"1252f9bc-f9c8-4a62-8bbf-b5e145f0e656\") " pod="openshift-image-registry/node-ca-xt2lb" Apr 20 14:53:32.612655 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612629 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnvqb\" (UniqueName: \"kubernetes.io/projected/1252f9bc-f9c8-4a62-8bbf-b5e145f0e656-kube-api-access-qnvqb\") pod \"node-ca-xt2lb\" (UID: \"1252f9bc-f9c8-4a62-8bbf-b5e145f0e656\") " pod="openshift-image-registry/node-ca-xt2lb" Apr 20 14:53:32.612655 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612652 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-os-release\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " 
pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.612855 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612685 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c75mc\" (UniqueName: \"kubernetes.io/projected/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-kube-api-access-c75mc\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.612855 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612712 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e72864c-5e7b-4f27-a09b-1c22c0833c25-host-slash\") pod \"iptables-alerter-dwkwl\" (UID: \"5e72864c-5e7b-4f27-a09b-1c22c0833c25\") " pod="openshift-network-operator/iptables-alerter-dwkwl" Apr 20 14:53:32.612855 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612736 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-systemd\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.612855 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612766 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-lib-modules\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.612855 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612790 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcjq4\" (UniqueName: \"kubernetes.io/projected/5e72864c-5e7b-4f27-a09b-1c22c0833c25-kube-api-access-lcjq4\") pod \"iptables-alerter-dwkwl\" (UID: \"5e72864c-5e7b-4f27-a09b-1c22c0833c25\") " pod="openshift-network-operator/iptables-alerter-dwkwl" Apr 20 14:53:32.612855 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612814 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-cni-binary-copy\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.612855 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612838 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612863 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-modprobe-d\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 
14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612881 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-tuned\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612894 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/665d6c9a-125a-4b06-a1a5-d6b7f642e117-tmp\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612917 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-sysctl-d\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612954 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-sysctl-conf\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612968 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-run\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.612984 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1252f9bc-f9c8-4a62-8bbf-b5e145f0e656-host\") pod \"node-ca-xt2lb\" (UID: \"1252f9bc-f9c8-4a62-8bbf-b5e145f0e656\") " pod="openshift-image-registry/node-ca-xt2lb" Apr 20 14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.613009 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-system-cni-dir\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.613033 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-cnibin\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.613054 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.613071 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d-konnectivity-ca\") pod \"konnectivity-agent-777tg\" (UID: \"fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d\") " pod="kube-system/konnectivity-agent-777tg" Apr 20 14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.613108 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frdcz\" (UniqueName: \"kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz\") pod \"network-check-target-rdpq8\" (UID: \"70b39118-4141-405c-9b0c-b59eec25451c\") " pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.613121 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5e72864c-5e7b-4f27-a09b-1c22c0833c25-iptables-alerter-script\") pod \"iptables-alerter-dwkwl\" (UID: \"5e72864c-5e7b-4f27-a09b-1c22c0833c25\") " pod="openshift-network-operator/iptables-alerter-dwkwl" Apr 20 14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.613163 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t6t8\" (UniqueName: \"kubernetes.io/projected/665d6c9a-125a-4b06-a1a5-d6b7f642e117-kube-api-access-8t6t8\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.613175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.613180 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d-agent-certs\") pod \"konnectivity-agent-777tg\" (UID: \"fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d\") " pod="kube-system/konnectivity-agent-777tg" Apr 20 14:53:32.613884 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.613197 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-sysconfig\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.613884 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.613249 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-kubernetes\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.613884 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.613301 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-var-lib-kubelet\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.613884 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.613322 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-host\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.613884 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.613705 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.616241 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.616227 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 14:53:32.616334 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.616242 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 14:53:32.617802 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.617783 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 14:53:32.617885 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.617800 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 14:53:32.617885 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.617825 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 14:53:32.617885 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.617847 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 14:53:32.618026 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.617951 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-f2g4l\"" Apr 20 14:53:32.664571 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.664533 2538 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:48:31 +0000 UTC" deadline="2028-01-19 09:19:56.64477849 +0000 UTC" Apr 20 14:53:32.664571 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.664562 2538 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15330h26m23.980221629s" Apr 20 14:53:32.705958 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.705925 2538 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 14:53:32.713832 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.713804 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c75mc\" (UniqueName: \"kubernetes.io/projected/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-kube-api-access-c75mc\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.713961 ip-10-0-142-255 kubenswrapper[2538]: I0420 
14:53:32.713848 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-os-release\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.713961 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.713878 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcjq4\" (UniqueName: \"kubernetes.io/projected/5e72864c-5e7b-4f27-a09b-1c22c0833c25-kube-api-access-lcjq4\") pod \"iptables-alerter-dwkwl\" (UID: \"5e72864c-5e7b-4f27-a09b-1c22c0833c25\") " pod="openshift-network-operator/iptables-alerter-dwkwl" Apr 20 14:53:32.713961 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.713902 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-registration-dir\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.713961 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.713927 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-var-lib-cni-bin\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.713961 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.713946 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-kubelet\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.713961 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.713960 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-run-ovn\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.714245 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.713977 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.714245 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.713999 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-tuned\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.714245 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714023 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/665d6c9a-125a-4b06-a1a5-d6b7f642e117-tmp\") pod \"tuned-fz52j\" (UID: 
\"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.714245 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714049 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-device-dir\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.714245 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714072 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-multus-conf-dir\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.714245 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714093 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-multus-socket-dir-parent\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.714245 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714117 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:32.714245 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714141 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-run\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.714245 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714162 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1252f9bc-f9c8-4a62-8bbf-b5e145f0e656-host\") pod \"node-ca-xt2lb\" (UID: \"1252f9bc-f9c8-4a62-8bbf-b5e145f0e656\") " pod="openshift-image-registry/node-ca-xt2lb" Apr 20 14:53:32.714245 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714185 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-cnibin\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.714245 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714211 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.714245 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714242 2538 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-system-cni-dir\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.714812 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714266 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-run-netns\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.714812 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714299 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-cni-netd\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.714812 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714325 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d-konnectivity-ca\") pod \"konnectivity-agent-777tg\" (UID: \"fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d\") " pod="kube-system/konnectivity-agent-777tg" Apr 20 14:53:32.714812 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714402 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-run\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.714812 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714522 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5e72864c-5e7b-4f27-a09b-1c22c0833c25-iptables-alerter-script\") pod \"iptables-alerter-dwkwl\" (UID: \"5e72864c-5e7b-4f27-a09b-1c22c0833c25\") " pod="openshift-network-operator/iptables-alerter-dwkwl" Apr 20 14:53:32.714812 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714557 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.714812 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714585 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-run-k8s-cni-cncf-io\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.714812 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714601 2538 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 14:53:32.715171 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714928 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d-konnectivity-ca\") pod \"konnectivity-agent-777tg\" (UID: \"fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d\") " pod="kube-system/konnectivity-agent-777tg" Apr 20 14:53:32.715171 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714960 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.715171 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715025 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1252f9bc-f9c8-4a62-8bbf-b5e145f0e656-host\") pod \"node-ca-xt2lb\" (UID: \"1252f9bc-f9c8-4a62-8bbf-b5e145f0e656\") " pod="openshift-image-registry/node-ca-xt2lb" Apr 20 14:53:32.715171 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715026 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-cnibin\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.715171 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.714610 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xgpr\" (UniqueName: \"kubernetes.io/projected/54adba6d-382e-43b7-9219-644ce4ea5f46-kube-api-access-9xgpr\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.715171 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715108 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-slash\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.715171 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715128 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5e72864c-5e7b-4f27-a09b-1c22c0833c25-iptables-alerter-script\") pod \"iptables-alerter-dwkwl\" (UID: \"5e72864c-5e7b-4f27-a09b-1c22c0833c25\") " pod="openshift-network-operator/iptables-alerter-dwkwl" Apr 20 14:53:32.715171 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715141 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8t6t8\" (UniqueName: \"kubernetes.io/projected/665d6c9a-125a-4b06-a1a5-d6b7f642e117-kube-api-access-8t6t8\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.715171 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715174 2538 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d-agent-certs\") pod \"konnectivity-agent-777tg\" (UID: \"fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d\") " pod="kube-system/konnectivity-agent-777tg" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715192 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-sysconfig\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715198 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715208 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-kubernetes\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715223 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-var-lib-kubelet\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715250 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9flq\" (UniqueName: \"kubernetes.io/projected/cf488808-137d-4895-9f36-c87fdbd47441-kube-api-access-w9flq\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715278 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-kubernetes\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715279 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-run-netns\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715303 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-run-systemd\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715314 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-var-lib-kubelet\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715334 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-sys\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715379 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715395 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-sys\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715356 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-sysconfig\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715406 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-hostroot\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715434 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-var-lib-openvswitch\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715458 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-run-ovn-kubernetes\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.715500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715481 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715507 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnvqb\" (UniqueName: \"kubernetes.io/projected/1252f9bc-f9c8-4a62-8bbf-b5e145f0e656-kube-api-access-qnvqb\") pod \"node-ca-xt2lb\" (UID: \"1252f9bc-f9c8-4a62-8bbf-b5e145f0e656\") " pod="openshift-image-registry/node-ca-xt2lb" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715532 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-os-release\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715560 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e72864c-5e7b-4f27-a09b-1c22c0833c25-host-slash\") pod \"iptables-alerter-dwkwl\" (UID: \"5e72864c-5e7b-4f27-a09b-1c22c0833c25\") " pod="openshift-network-operator/iptables-alerter-dwkwl" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715584 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-etc-selinux\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715608 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-var-lib-cni-multus\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715631 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-etc-openvswitch\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715656 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-systemd\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715688 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-lib-modules\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 
14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715713 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/54adba6d-382e-43b7-9219-644ce4ea5f46-multus-daemon-config\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715737 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5e16698f-ad67-45e2-8b90-cd0a144a2469-ovnkube-script-lib\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715762 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-cni-binary-copy\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715780 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715851 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-os-release\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715893 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-modprobe-d\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715898 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-systemd\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715785 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-modprobe-d\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.716277 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715937 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/5e16698f-ad67-45e2-8b90-cd0a144a2469-ovn-node-metrics-cert\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715944 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-lib-modules\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715974 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/54adba6d-382e-43b7-9219-644ce4ea5f46-cni-binary-copy\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.715945 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e72864c-5e7b-4f27-a09b-1c22c0833c25-host-slash\") pod \"iptables-alerter-dwkwl\" (UID: \"5e72864c-5e7b-4f27-a09b-1c22c0833c25\") " pod="openshift-network-operator/iptables-alerter-dwkwl" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716003 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-var-lib-kubelet\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716083 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh2tx\" (UniqueName: \"kubernetes.io/projected/5752b3ca-4688-4db8-9995-af78bc6f30d3-kube-api-access-fh2tx\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716131 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-sysctl-d\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716166 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-sysctl-conf\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716201 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-system-cni-dir\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.717071 ip-10-0-142-255 
kubenswrapper[2538]: I0420 14:53:32.716229 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-socket-dir\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716248 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-sysctl-d\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716271 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-system-cni-dir\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716305 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-sys-fs\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716369 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-cnibin\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716382 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-sysctl-conf\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716396 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-cni-bin\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716443 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-cni-binary-copy\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.717071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716444 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frdcz\" (UniqueName: 
\"kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz\") pod \"network-check-target-rdpq8\" (UID: \"70b39118-4141-405c-9b0c-b59eec25451c\") " pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:32.717867 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716487 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-multus-cni-dir\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.717867 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716512 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-run-openvswitch\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.717867 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716535 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-log-socket\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.717867 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716567 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-host\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.717867 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716611 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/665d6c9a-125a-4b06-a1a5-d6b7f642e117-host\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.717867 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716646 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-run-multus-certs\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.717867 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716678 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-systemd-units\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.717867 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716704 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e16698f-ad67-45e2-8b90-cd0a144a2469-ovnkube-config\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.717867 ip-10-0-142-255 kubenswrapper[2538]: 
I0420 14:53:32.716750 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e16698f-ad67-45e2-8b90-cd0a144a2469-env-overrides\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.717867 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716779 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l857h\" (UniqueName: \"kubernetes.io/projected/5e16698f-ad67-45e2-8b90-cd0a144a2469-kube-api-access-l857h\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.717867 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716813 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-etc-kubernetes\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.717867 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716832 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-node-log\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.717867 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.716857 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1252f9bc-f9c8-4a62-8bbf-b5e145f0e656-serviceca\") pod \"node-ca-xt2lb\" (UID: \"1252f9bc-f9c8-4a62-8bbf-b5e145f0e656\") " pod="openshift-image-registry/node-ca-xt2lb" Apr 20 14:53:32.717867 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.717283 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1252f9bc-f9c8-4a62-8bbf-b5e145f0e656-serviceca\") pod \"node-ca-xt2lb\" (UID: \"1252f9bc-f9c8-4a62-8bbf-b5e145f0e656\") " pod="openshift-image-registry/node-ca-xt2lb" Apr 20 14:53:32.718845 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.718805 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d-agent-certs\") pod \"konnectivity-agent-777tg\" (UID: \"fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d\") " pod="kube-system/konnectivity-agent-777tg" Apr 20 14:53:32.719097 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.719078 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/665d6c9a-125a-4b06-a1a5-d6b7f642e117-etc-tuned\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.719180 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.719120 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/665d6c9a-125a-4b06-a1a5-d6b7f642e117-tmp\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.726042 ip-10-0-142-255 
kubenswrapper[2538]: I0420 14:53:32.726020 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcjq4\" (UniqueName: \"kubernetes.io/projected/5e72864c-5e7b-4f27-a09b-1c22c0833c25-kube-api-access-lcjq4\") pod \"iptables-alerter-dwkwl\" (UID: \"5e72864c-5e7b-4f27-a09b-1c22c0833c25\") " pod="openshift-network-operator/iptables-alerter-dwkwl" Apr 20 14:53:32.727275 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.727256 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c75mc\" (UniqueName: \"kubernetes.io/projected/2cb834ae-b00c-44c5-8c4b-591c1777bf5f-kube-api-access-c75mc\") pod \"multus-additional-cni-plugins-fsr4z\" (UID: \"2cb834ae-b00c-44c5-8c4b-591c1777bf5f\") " pod="openshift-multus/multus-additional-cni-plugins-fsr4z" Apr 20 14:53:32.727428 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.727411 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t6t8\" (UniqueName: \"kubernetes.io/projected/665d6c9a-125a-4b06-a1a5-d6b7f642e117-kube-api-access-8t6t8\") pod \"tuned-fz52j\" (UID: \"665d6c9a-125a-4b06-a1a5-d6b7f642e117\") " pod="openshift-cluster-node-tuning-operator/tuned-fz52j" Apr 20 14:53:32.730907 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:32.730885 2538 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:53:32.730907 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:32.730909 2538 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:53:32.731044 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:32.730920 2538 projected.go:194] Error preparing data for projected volume kube-api-access-frdcz for pod openshift-network-diagnostics/network-check-target-rdpq8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:32.731044 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:32.730988 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz podName:70b39118-4141-405c-9b0c-b59eec25451c nodeName:}" failed. No retries permitted until 2026-04-20 14:53:33.230961212 +0000 UTC m=+3.049586737 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-frdcz" (UniqueName: "kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz") pod "network-check-target-rdpq8" (UID: "70b39118-4141-405c-9b0c-b59eec25451c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:32.732468 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.732444 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnvqb\" (UniqueName: \"kubernetes.io/projected/1252f9bc-f9c8-4a62-8bbf-b5e145f0e656-kube-api-access-qnvqb\") pod \"node-ca-xt2lb\" (UID: \"1252f9bc-f9c8-4a62-8bbf-b5e145f0e656\") " pod="openshift-image-registry/node-ca-xt2lb" Apr 20 14:53:32.817276 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817244 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-cnibin\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.817466 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817286 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-cni-bin\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.817466 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817363 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-cnibin\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.817466 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817374 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-multus-cni-dir\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.817466 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817397 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-cni-bin\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.817466 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817410 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-run-openvswitch\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.817466 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817432 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-log-socket\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.817466 ip-10-0-142-255 
kubenswrapper[2538]: I0420 14:53:32.817437 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-multus-cni-dir\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.817703 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817489 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-log-socket\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.817703 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817490 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-run-openvswitch\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.817703 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817529 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-run-multus-certs\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.817703 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817545 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-systemd-units\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.817703 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817562 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e16698f-ad67-45e2-8b90-cd0a144a2469-ovnkube-config\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.817703 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817587 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e16698f-ad67-45e2-8b90-cd0a144a2469-env-overrides\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.817703 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817601 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-systemd-units\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.817703 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817564 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-run-multus-certs\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.817703 ip-10-0-142-255 
kubenswrapper[2538]: I0420 14:53:32.817615 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l857h\" (UniqueName: \"kubernetes.io/projected/5e16698f-ad67-45e2-8b90-cd0a144a2469-kube-api-access-l857h\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.818021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817822 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-etc-kubernetes\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.818021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817843 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-node-log\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.818021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817862 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-os-release\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.818021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817886 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-registration-dir\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.818021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817894 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-etc-kubernetes\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.818021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817910 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-var-lib-cni-bin\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.818021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817914 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-node-log\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.818021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817947 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-kubelet\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.818021 ip-10-0-142-255 
kubenswrapper[2538]: I0420 14:53:32.817970 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-registration-dir\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.818021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817949 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-var-lib-cni-bin\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.818021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817980 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-kubelet\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.818021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817975 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-run-ovn\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.818021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.817977 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-os-release\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.818021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818005 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-run-ovn\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.818021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818029 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-device-dir\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818043 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e16698f-ad67-45e2-8b90-cd0a144a2469-env-overrides\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818048 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-multus-conf-dir\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.818683 ip-10-0-142-255 
kubenswrapper[2538]: I0420 14:53:32.818073 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-multus-conf-dir\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818090 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e16698f-ad67-45e2-8b90-cd0a144a2469-ovnkube-config\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818099 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-multus-socket-dir-parent\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818094 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-device-dir\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818117 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818138 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-multus-socket-dir-parent\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818141 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-system-cni-dir\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818168 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-system-cni-dir\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818180 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-run-netns\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.818683 ip-10-0-142-255 
kubenswrapper[2538]: E0420 14:53:32.818194 2538 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818208 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-cni-netd\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818230 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-run-netns\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818238 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:32.818254 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs podName:5752b3ca-4688-4db8-9995-af78bc6f30d3 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:33.318235458 +0000 UTC m=+3.136860998 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs") pod "network-metrics-daemon-fdmrj" (UID: "5752b3ca-4688-4db8-9995-af78bc6f30d3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:32.818683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818267 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-cni-netd\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818292 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818311 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-run-k8s-cni-cncf-io\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818338 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xgpr\" (UniqueName: \"kubernetes.io/projected/54adba6d-382e-43b7-9219-644ce4ea5f46-kube-api-access-9xgpr\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818376 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-slash\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818374 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-run-k8s-cni-cncf-io\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818397 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9flq\" (UniqueName: \"kubernetes.io/projected/cf488808-137d-4895-9f36-c87fdbd47441-kube-api-access-w9flq\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818420 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-run-netns\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg" Apr 20 14:53:32.819431 
ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818443 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-run-systemd\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818454 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-slash\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818471 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-hostroot\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg"
Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818497 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-var-lib-openvswitch\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818504 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-run-systemd\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818512 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-run-netns\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg"
Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818520 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-run-ovn-kubernetes\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818548 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-hostroot\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg"
Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818564 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.819431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818565 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-var-lib-openvswitch\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818594 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-etc-selinux\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818619 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-var-lib-cni-multus\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818627 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-run-ovn-kubernetes\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818644 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-etc-openvswitch\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818657 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-var-lib-cni-multus\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818672 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/54adba6d-382e-43b7-9219-644ce4ea5f46-multus-daemon-config\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818666 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-etc-selinux\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818697 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5e16698f-ad67-45e2-8b90-cd0a144a2469-ovnkube-script-lib\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818696 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818728 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e16698f-ad67-45e2-8b90-cd0a144a2469-ovn-node-metrics-cert\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818724 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e16698f-ad67-45e2-8b90-cd0a144a2469-etc-openvswitch\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818753 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/54adba6d-382e-43b7-9219-644ce4ea5f46-cni-binary-copy\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818779 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-var-lib-kubelet\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818806 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh2tx\" (UniqueName: \"kubernetes.io/projected/5752b3ca-4688-4db8-9995-af78bc6f30d3-kube-api-access-fh2tx\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818835 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-socket-dir\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818843 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/54adba6d-382e-43b7-9219-644ce4ea5f46-host-var-lib-kubelet\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg"
Apr 20 14:53:32.819886 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818859 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-sys-fs\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82"
Apr 20 14:53:32.820487 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818931 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-sys-fs\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82"
Apr 20 14:53:32.820487 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.818955 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cf488808-137d-4895-9f36-c87fdbd47441-socket-dir\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82"
Apr 20 14:53:32.820487 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.819267 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/54adba6d-382e-43b7-9219-644ce4ea5f46-cni-binary-copy\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg"
Apr 20 14:53:32.820487 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.819285 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/54adba6d-382e-43b7-9219-644ce4ea5f46-multus-daemon-config\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg"
Apr 20 14:53:32.820487 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.819299 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5e16698f-ad67-45e2-8b90-cd0a144a2469-ovnkube-script-lib\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.821197 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.821179 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e16698f-ad67-45e2-8b90-cd0a144a2469-ovn-node-metrics-cert\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.829421 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.829397 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh2tx\" (UniqueName: \"kubernetes.io/projected/5752b3ca-4688-4db8-9995-af78bc6f30d3-kube-api-access-fh2tx\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj"
Apr 20 14:53:32.829530 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.829449 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9flq\" (UniqueName: \"kubernetes.io/projected/cf488808-137d-4895-9f36-c87fdbd47441-kube-api-access-w9flq\") pod \"aws-ebs-csi-driver-node-8qt82\" (UID: \"cf488808-137d-4895-9f36-c87fdbd47441\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82"
Apr 20 14:53:32.829689 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.829671 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l857h\" (UniqueName: \"kubernetes.io/projected/5e16698f-ad67-45e2-8b90-cd0a144a2469-kube-api-access-l857h\") pod \"ovnkube-node-q22pz\" (UID: \"5e16698f-ad67-45e2-8b90-cd0a144a2469\") " pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
Apr 20 14:53:32.830141 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.830081 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xgpr\" (UniqueName: \"kubernetes.io/projected/54adba6d-382e-43b7-9219-644ce4ea5f46-kube-api-access-9xgpr\") pod \"multus-mw2xg\" (UID: \"54adba6d-382e-43b7-9219-644ce4ea5f46\") " pod="openshift-multus/multus-mw2xg"
Apr 20 14:53:32.890038 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.890007 2538 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:53:32.905412 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.905382 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-777tg"
Apr 20 14:53:32.911276 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.911256 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fz52j"
Apr 20 14:53:32.920882 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.920863 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xt2lb"
Apr 20 14:53:32.925388 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.925370 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fsr4z"
Apr 20 14:53:32.932168 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.932150 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dwkwl"
Apr 20 14:53:32.939116 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.939099 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82"
Apr 20 14:53:32.947633 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.947613 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mw2xg"
Apr 20 14:53:32.952190 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.952173 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz"
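
Note: the records above show kubelet's two-phase volume setup: reconciler_common.go logs "operationExecutor.MountVolume started" when the reconciler kicks off a mount, and operation_generator.go logs "MountVolume.SetUp succeeded" when the plugin finishes. When triaging a boot like this, it helps to pair the two records per pod/volume and flag any volume that never reports success. Below is a minimal sketch of such a helper (pairmounts.go is a name chosen here, not a real tool); it assumes journalctl text on stdin in exactly the format shown above:

// pairmounts.go - hypothetical triage helper: pairs "MountVolume started"
// records with "MountVolume.SetUp succeeded" records and prints any
// pod/volume pair that never completed.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	// The volume name appears escaped (\"name\") inside the klog message,
	// and the structured pod="ns/name" field closes each record.
	started   = regexp.MustCompile(`operationExecutor\.MountVolume started for volume \\"([^"\\]+)\\".*pod="([^"]+)"`)
	succeeded = regexp.MustCompile(`MountVolume\.SetUp succeeded for volume \\"([^"\\]+)\\".*pod="([^"]+)"`)
)

func main() {
	pending := map[string]bool{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal records can be long
	for sc.Scan() {
		line := sc.Text()
		if m := started.FindStringSubmatch(line); m != nil {
			pending[m[2]+"/"+m[1]] = true
		}
		if m := succeeded.FindStringSubmatch(line); m != nil {
			delete(pending, m[2]+"/"+m[1])
		}
	}
	for key := range pending {
		fmt.Println("never mounted:", key)
	}
}

Run against this log (e.g. journalctl -u kubelet | go run pairmounts.go) it would surface metrics-certs, kube-api-access-frdcz, and original-pull-secret, the API-backed volumes whose failures dominate the rest of this excerpt.
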
Apr 20 14:53:32.973529 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:32.973508 2538 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:53:33.302162 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:33.302135 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf488808_137d_4895_9f36_c87fdbd47441.slice/crio-244a92d4b82ec88ddc5b985f8a1377c4c66ed02c7af6b50f354371ad116bb2cf WatchSource:0}: Error finding container 244a92d4b82ec88ddc5b985f8a1377c4c66ed02c7af6b50f354371ad116bb2cf: Status 404 returned error can't find the container with id 244a92d4b82ec88ddc5b985f8a1377c4c66ed02c7af6b50f354371ad116bb2cf
Apr 20 14:53:33.302802 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:33.302760 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54adba6d_382e_43b7_9219_644ce4ea5f46.slice/crio-10dd0f64dddd3bfde5965743db972b2777e6eaaa948979b997f304eaabc9d2f9 WatchSource:0}: Error finding container 10dd0f64dddd3bfde5965743db972b2777e6eaaa948979b997f304eaabc9d2f9: Status 404 returned error can't find the container with id 10dd0f64dddd3bfde5965743db972b2777e6eaaa948979b997f304eaabc9d2f9
Apr 20 14:53:33.304206 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:33.304056 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e16698f_ad67_45e2_8b90_cd0a144a2469.slice/crio-9f077bb1936ff80ce54e133b0ccf5f752d7065e9e0519853fb53d9eb8861d892 WatchSource:0}: Error finding container 9f077bb1936ff80ce54e133b0ccf5f752d7065e9e0519853fb53d9eb8861d892: Status 404 returned error can't find the container with id 9f077bb1936ff80ce54e133b0ccf5f752d7065e9e0519853fb53d9eb8861d892
Apr 20 14:53:33.308866 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:33.307582 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa1d6f1c_e0bc_4dae_beb6_31f8ba86f47d.slice/crio-170243b02bab172a760aef59a01bcdabe2d0dbdb50910f1f7befdf5213775706 WatchSource:0}: Error finding container 170243b02bab172a760aef59a01bcdabe2d0dbdb50910f1f7befdf5213775706: Status 404 returned error can't find the container with id 170243b02bab172a760aef59a01bcdabe2d0dbdb50910f1f7befdf5213775706
Apr 20 14:53:33.322413 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.322392 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj"
Apr 20 14:53:33.322496 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.322452 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frdcz\" (UniqueName: \"kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz\") pod \"network-check-target-rdpq8\" (UID: \"70b39118-4141-405c-9b0c-b59eec25451c\") " pod="openshift-network-diagnostics/network-check-target-rdpq8"
Apr 20 14:53:33.322547 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:33.322536 2538 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:53:33.322589 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:33.322564 2538 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 14:53:33.322589 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:33.322585 2538 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 14:53:33.322657 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:33.322593 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs podName:5752b3ca-4688-4db8-9995-af78bc6f30d3 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:34.322574219 +0000 UTC m=+4.141199741 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs") pod "network-metrics-daemon-fdmrj" (UID: "5752b3ca-4688-4db8-9995-af78bc6f30d3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:53:33.322657 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:33.322599 2538 projected.go:194] Error preparing data for projected volume kube-api-access-frdcz for pod openshift-network-diagnostics/network-check-target-rdpq8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:53:33.322657 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:33.322651 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz podName:70b39118-4141-405c-9b0c-b59eec25451c nodeName:}" failed. No retries permitted until 2026-04-20 14:53:34.322637878 +0000 UTC m=+4.141263413 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-frdcz" (UniqueName: "kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz") pod "network-check-target-rdpq8" (UID: "70b39118-4141-405c-9b0c-b59eec25451c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:53:33.665515 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.665412 2538 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:48:31 +0000 UTC" deadline="2027-11-03 01:19:04.345344979 +0000 UTC"
Apr 20 14:53:33.665515 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.665454 2538 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13474h25m30.679895191s"
Apr 20 14:53:33.702246 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.701631 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj"
Apr 20 14:53:33.702246 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:33.701787 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3"
Apr 20 14:53:33.710794 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.710722 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dwkwl" event={"ID":"5e72864c-5e7b-4f27-a09b-1c22c0833c25","Type":"ContainerStarted","Data":"9dbcc272b0806b01e1f5b76128a3a79d7ed54c094798169d5a5015c2597babd4"}
Apr 20 14:53:33.715917 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.715859 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xt2lb" event={"ID":"1252f9bc-f9c8-4a62-8bbf-b5e145f0e656","Type":"ContainerStarted","Data":"153114c1be577a5c9638fb41db72bb2ce4139cb28f8716fa96b15cf9147acea2"}
Apr 20 14:53:33.720233 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.720189 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-777tg" event={"ID":"fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d","Type":"ContainerStarted","Data":"170243b02bab172a760aef59a01bcdabe2d0dbdb50910f1f7befdf5213775706"}
Apr 20 14:53:33.725542 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.725209 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" event={"ID":"e688fec9147a531ae0f3ba981a4ec304","Type":"ContainerStarted","Data":"02af138d2c693220dc3a69b26f8592322b64b3dcdba154ce4731b26d6115ec7f"}
Apr 20 14:53:33.727333 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.727275 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fz52j" event={"ID":"665d6c9a-125a-4b06-a1a5-d6b7f642e117","Type":"ContainerStarted","Data":"973abac9bc2565e32146daededded2b4068ef74447a7fb007e548e2154a5372d"}
Apr 20 14:53:33.735525 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.735467 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsr4z" event={"ID":"2cb834ae-b00c-44c5-8c4b-591c1777bf5f","Type":"ContainerStarted","Data":"598751fd77ef743c56e480fd3c101914023cf2548206b0d61c7d3b2b00fb4afb"}
Apr 20 14:53:33.740522 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.739937 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" event={"ID":"5e16698f-ad67-45e2-8b90-cd0a144a2469","Type":"ContainerStarted","Data":"9f077bb1936ff80ce54e133b0ccf5f752d7065e9e0519853fb53d9eb8861d892"}
Apr 20 14:53:33.741677 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.740958 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" podStartSLOduration=2.740943272 podStartE2EDuration="2.740943272s" podCreationTimestamp="2026-04-20 14:53:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:53:33.740506411 +0000 UTC m=+3.559131955" watchObservedRunningTime="2026-04-20 14:53:33.740943272 +0000 UTC m=+3.559568816"
Apr 20 14:53:33.745146 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.745122 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mw2xg" event={"ID":"54adba6d-382e-43b7-9219-644ce4ea5f46","Type":"ContainerStarted","Data":"10dd0f64dddd3bfde5965743db972b2777e6eaaa948979b997f304eaabc9d2f9"}
Apr 20 14:53:33.747358 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:33.747300 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" event={"ID":"cf488808-137d-4895-9f36-c87fdbd47441","Type":"ContainerStarted","Data":"244a92d4b82ec88ddc5b985f8a1377c4c66ed02c7af6b50f354371ad116bb2cf"}
Apr 20 14:53:34.331389 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:34.331336 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frdcz\" (UniqueName: \"kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz\") pod \"network-check-target-rdpq8\" (UID: \"70b39118-4141-405c-9b0c-b59eec25451c\") " pod="openshift-network-diagnostics/network-check-target-rdpq8"
Apr 20 14:53:34.331591 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:34.331424 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj"
Apr 20 14:53:34.331591 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:34.331558 2538 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:53:34.331711 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:34.331623 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs podName:5752b3ca-4688-4db8-9995-af78bc6f30d3 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:36.331604833 +0000 UTC m=+6.150230357 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs") pod "network-metrics-daemon-fdmrj" (UID: "5752b3ca-4688-4db8-9995-af78bc6f30d3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:53:34.331772 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:34.331727 2538 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 14:53:34.331772 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:34.331742 2538 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 14:53:34.331772 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:34.331755 2538 projected.go:194] Error preparing data for projected volume kube-api-access-frdcz for pod openshift-network-diagnostics/network-check-target-rdpq8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:53:34.331915 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:34.331790 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz podName:70b39118-4141-405c-9b0c-b59eec25451c nodeName:}" failed. No retries permitted until 2026-04-20 14:53:36.331778822 +0000 UTC m=+6.150404346 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-frdcz" (UniqueName: "kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz") pod "network-check-target-rdpq8" (UID: "70b39118-4141-405c-9b0c-b59eec25451c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:53:34.703922 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:34.703306 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8"
Apr 20 14:53:34.703922 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:34.703446 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c"
Apr 20 14:53:34.755384 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:34.755327 2538 generic.go:358] "Generic (PLEG): container finished" podID="e121b59458e83a52d173db12d00639e1" containerID="df18e517f2e18c3c2780a7cbb7b572e34197f21c8e7bc546c89452c159e8e2d8" exitCode=0
Apr 20 14:53:34.756396 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:34.756369 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" event={"ID":"e121b59458e83a52d173db12d00639e1","Type":"ContainerDied","Data":"df18e517f2e18c3c2780a7cbb7b572e34197f21c8e7bc546c89452c159e8e2d8"}
Apr 20 14:53:35.701491 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:35.701437 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj"
Apr 20 14:53:35.701722 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:35.701601 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3"
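
Note: the repeating "no CNI configuration file in /etc/kubernetes/cni/net.d/" records mean the container runtime is still reporting NetworkReady=false, so kubelet skips sandbox creation for every pod that needs the pod network; on this node the ovnkube-node pod started above is the network provider that eventually writes that config. A rough sketch of the readiness condition the message describes follows; the authoritative file-matching rules live in the runtime's libcni/ocicni code, so the *.conf/*.conflist/*.json globs here are an assumption, not the real implementation:

// cnicheck.go - approximates the gate behind the NetworkPluginNotReady
// records: the runtime stays NetworkReady=false until a CNI config
// file exists in the directory named in the log.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether any plausible CNI config file is present.
func hasCNIConfig(dir string) bool {
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		if matches, _ := filepath.Glob(filepath.Join(dir, pat)); len(matches) > 0 {
			return true
		}
	}
	return false
}

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path taken verbatim from the records above
	if hasCNIConfig(dir) {
		fmt.Println("CNI config present; NetworkReady should flip to true")
		return
	}
	fmt.Println("no CNI config yet; pods needing the pod network stay in sandbox-creation retry")
	os.Exit(1)
}

Once ovnkube-node drops its config into that directory, the "Error syncing pod, skipping" loop below stops on the next sync.
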
pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:35.762106 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:35.761317 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" event={"ID":"e121b59458e83a52d173db12d00639e1","Type":"ContainerStarted","Data":"e28e9b4f8010f771b4a718256a918ec39253726c056f937805eb8db7fce50965"} Apr 20 14:53:36.346515 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:36.346473 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frdcz\" (UniqueName: \"kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz\") pod \"network-check-target-rdpq8\" (UID: \"70b39118-4141-405c-9b0c-b59eec25451c\") " pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:36.346715 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:36.346543 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:36.346715 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:36.346658 2538 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:36.346715 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:36.346690 2538 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:53:36.346715 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:36.346715 2538 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:53:36.346926 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:36.346729 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs podName:5752b3ca-4688-4db8-9995-af78bc6f30d3 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:40.34670893 +0000 UTC m=+10.165334466 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs") pod "network-metrics-daemon-fdmrj" (UID: "5752b3ca-4688-4db8-9995-af78bc6f30d3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:36.346926 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:36.346729 2538 projected.go:194] Error preparing data for projected volume kube-api-access-frdcz for pod openshift-network-diagnostics/network-check-target-rdpq8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:36.346926 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:36.346769 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz podName:70b39118-4141-405c-9b0c-b59eec25451c nodeName:}" failed. No retries permitted until 2026-04-20 14:53:40.346760275 +0000 UTC m=+10.165385799 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-frdcz" (UniqueName: "kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz") pod "network-check-target-rdpq8" (UID: "70b39118-4141-405c-9b0c-b59eec25451c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:36.704061 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:36.703959 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:36.704230 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:36.704087 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:53:37.701931 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:37.701815 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:37.702501 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:37.701969 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:38.701371 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.700899 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:38.701371 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:38.701029 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:53:38.837487 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.837400 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" podStartSLOduration=7.837375583 podStartE2EDuration="7.837375583s" podCreationTimestamp="2026-04-20 14:53:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:53:35.776583174 +0000 UTC m=+5.595208720" watchObservedRunningTime="2026-04-20 14:53:38.837375583 +0000 UTC m=+8.656001129" Apr 20 14:53:38.838077 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.837559 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-64772"] Apr 20 14:53:38.840827 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.840800 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-64772" Apr 20 14:53:38.844199 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.843731 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 14:53:38.844199 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.844050 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 14:53:38.844408 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.844359 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nzkkj\"" Apr 20 14:53:38.867175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.867074 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2d674b0f-26a1-44f7-8346-4ad4d666371e-tmp-dir\") pod \"node-resolver-64772\" (UID: \"2d674b0f-26a1-44f7-8346-4ad4d666371e\") " pod="openshift-dns/node-resolver-64772" Apr 20 14:53:38.867175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.867149 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrtm7\" (UniqueName: \"kubernetes.io/projected/2d674b0f-26a1-44f7-8346-4ad4d666371e-kube-api-access-nrtm7\") pod \"node-resolver-64772\" (UID: \"2d674b0f-26a1-44f7-8346-4ad4d666371e\") " pod="openshift-dns/node-resolver-64772" Apr 20 14:53:38.867429 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.867226 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2d674b0f-26a1-44f7-8346-4ad4d666371e-hosts-file\") pod \"node-resolver-64772\" (UID: \"2d674b0f-26a1-44f7-8346-4ad4d666371e\") " pod="openshift-dns/node-resolver-64772" Apr 20 14:53:38.968127 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.967786 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2d674b0f-26a1-44f7-8346-4ad4d666371e-tmp-dir\") pod \"node-resolver-64772\" (UID: \"2d674b0f-26a1-44f7-8346-4ad4d666371e\") " pod="openshift-dns/node-resolver-64772" Apr 20 14:53:38.968127 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.967885 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrtm7\" (UniqueName: \"kubernetes.io/projected/2d674b0f-26a1-44f7-8346-4ad4d666371e-kube-api-access-nrtm7\") pod \"node-resolver-64772\" (UID: \"2d674b0f-26a1-44f7-8346-4ad4d666371e\") " pod="openshift-dns/node-resolver-64772" Apr 20 14:53:38.968127 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.967920 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2d674b0f-26a1-44f7-8346-4ad4d666371e-hosts-file\") pod \"node-resolver-64772\" (UID: \"2d674b0f-26a1-44f7-8346-4ad4d666371e\") " pod="openshift-dns/node-resolver-64772" Apr 20 14:53:38.968127 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.968044 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2d674b0f-26a1-44f7-8346-4ad4d666371e-hosts-file\") pod \"node-resolver-64772\" (UID: \"2d674b0f-26a1-44f7-8346-4ad4d666371e\") " pod="openshift-dns/node-resolver-64772" Apr 20 14:53:38.968536 ip-10-0-142-255 kubenswrapper[2538]: I0420 
14:53:38.968409 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2d674b0f-26a1-44f7-8346-4ad4d666371e-tmp-dir\") pod \"node-resolver-64772\" (UID: \"2d674b0f-26a1-44f7-8346-4ad4d666371e\") " pod="openshift-dns/node-resolver-64772" Apr 20 14:53:38.979516 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:38.979458 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrtm7\" (UniqueName: \"kubernetes.io/projected/2d674b0f-26a1-44f7-8346-4ad4d666371e-kube-api-access-nrtm7\") pod \"node-resolver-64772\" (UID: \"2d674b0f-26a1-44f7-8346-4ad4d666371e\") " pod="openshift-dns/node-resolver-64772" Apr 20 14:53:39.153329 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:39.153293 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-64772" Apr 20 14:53:39.702337 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:39.701788 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:39.702337 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:39.701960 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:40.379940 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:40.379900 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frdcz\" (UniqueName: \"kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz\") pod \"network-check-target-rdpq8\" (UID: \"70b39118-4141-405c-9b0c-b59eec25451c\") " pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:40.380560 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:40.379974 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:40.380560 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:40.380112 2538 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:40.380560 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:40.380179 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs podName:5752b3ca-4688-4db8-9995-af78bc6f30d3 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:48.380160987 +0000 UTC m=+18.198786513 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs") pod "network-metrics-daemon-fdmrj" (UID: "5752b3ca-4688-4db8-9995-af78bc6f30d3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:40.380560 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:40.380265 2538 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:53:40.380560 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:40.380278 2538 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:53:40.380560 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:40.380291 2538 projected.go:194] Error preparing data for projected volume kube-api-access-frdcz for pod openshift-network-diagnostics/network-check-target-rdpq8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:40.380560 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:40.380326 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz podName:70b39118-4141-405c-9b0c-b59eec25451c nodeName:}" failed. No retries permitted until 2026-04-20 14:53:48.380314366 +0000 UTC m=+18.198939889 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-frdcz" (UniqueName: "kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz") pod "network-check-target-rdpq8" (UID: "70b39118-4141-405c-9b0c-b59eec25451c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:40.703020 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:40.702527 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:40.703020 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:40.702639 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:53:41.701604 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:41.701569 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:41.702019 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:41.701709 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:42.701620 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:42.701579 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:42.702201 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:42.701717 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:53:43.701978 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:43.701936 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:43.702412 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:43.702121 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:44.702049 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:44.702015 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:44.702514 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:44.702160 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:53:45.701286 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:45.701245 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:45.701459 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:45.701398 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:46.703940 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:46.703915 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:46.704472 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:46.704035 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:53:47.700991 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:47.700950 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:47.701225 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:47.701080 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:48.434646 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:48.434228 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:48.434646 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:48.434307 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frdcz\" (UniqueName: \"kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz\") pod \"network-check-target-rdpq8\" (UID: \"70b39118-4141-405c-9b0c-b59eec25451c\") " pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:48.434646 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:48.434467 2538 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:53:48.434646 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:48.434484 2538 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:53:48.434646 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:48.434497 2538 projected.go:194] Error preparing data for projected volume kube-api-access-frdcz for pod openshift-network-diagnostics/network-check-target-rdpq8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:48.434646 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:48.434553 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz podName:70b39118-4141-405c-9b0c-b59eec25451c nodeName:}" failed. No retries permitted until 2026-04-20 14:54:04.434535001 +0000 UTC m=+34.253160536 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-frdcz" (UniqueName: "kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz") pod "network-check-target-rdpq8" (UID: "70b39118-4141-405c-9b0c-b59eec25451c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:48.435410 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:48.434740 2538 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:48.435410 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:48.434813 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs podName:5752b3ca-4688-4db8-9995-af78bc6f30d3 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:04.434792535 +0000 UTC m=+34.253418069 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs") pod "network-metrics-daemon-fdmrj" (UID: "5752b3ca-4688-4db8-9995-af78bc6f30d3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:48.704104 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:48.704027 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:48.704104 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:48.704058 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:48.704298 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:48.704156 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:48.704298 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:48.704274 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:53:49.070853 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:49.070819 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-t7x4d"] Apr 20 14:53:49.100792 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:49.100761 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:49.100940 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:49.100835 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
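
Note: the nestedpendingoperations.go records trace an exponential backoff: durationBeforeRetry doubles on each consecutive failure (1s, 2s, 4s, 8s, then 16s for metrics-certs and kube-api-access-frdcz above, and 500ms, 1s, 2s for original-pull-secret below), and "No retries permitted until" is simply the failure time plus that duration. The sketch below reproduces only the arithmetic visible in these records; the constants are inferred from this log, not quoted from kubelet source:

// backoff.go - prints the doubling retry schedule implied by the
// durationBeforeRetry values in the records above: start at 500ms
// and multiply by two after every consecutive failure.
package main

import (
	"fmt"
	"time"
)

func main() {
	retry := 500 * time.Millisecond // first observed backoff in this log
	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("after failure %d wait %v before retrying\n", attempt, retry)
		retry *= 2 // the doubling seen between successive failure records
	}
}

Running it prints 500ms, 1s, 2s, 4s, 8s, 16s, matching the sequence in the log; the backoff resets once a retry finally succeeds (here, once the informer caches for the missing secret/configMap objects populate).
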
pod="kube-system/global-pull-secret-syncer-t7x4d" podUID="4a876a88-f48c-4f22-83fa-9e878cf5029d" Apr 20 14:53:49.140829 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:49.140794 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4a876a88-f48c-4f22-83fa-9e878cf5029d-dbus\") pod \"global-pull-secret-syncer-t7x4d\" (UID: \"4a876a88-f48c-4f22-83fa-9e878cf5029d\") " pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:49.140995 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:49.140840 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret\") pod \"global-pull-secret-syncer-t7x4d\" (UID: \"4a876a88-f48c-4f22-83fa-9e878cf5029d\") " pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:49.140995 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:49.140870 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4a876a88-f48c-4f22-83fa-9e878cf5029d-kubelet-config\") pod \"global-pull-secret-syncer-t7x4d\" (UID: \"4a876a88-f48c-4f22-83fa-9e878cf5029d\") " pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:49.242207 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:49.242172 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4a876a88-f48c-4f22-83fa-9e878cf5029d-kubelet-config\") pod \"global-pull-secret-syncer-t7x4d\" (UID: \"4a876a88-f48c-4f22-83fa-9e878cf5029d\") " pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:49.242395 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:49.242260 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4a876a88-f48c-4f22-83fa-9e878cf5029d-dbus\") pod \"global-pull-secret-syncer-t7x4d\" (UID: \"4a876a88-f48c-4f22-83fa-9e878cf5029d\") " pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:49.242395 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:49.242295 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret\") pod \"global-pull-secret-syncer-t7x4d\" (UID: \"4a876a88-f48c-4f22-83fa-9e878cf5029d\") " pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:49.242395 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:49.242310 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4a876a88-f48c-4f22-83fa-9e878cf5029d-kubelet-config\") pod \"global-pull-secret-syncer-t7x4d\" (UID: \"4a876a88-f48c-4f22-83fa-9e878cf5029d\") " pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:49.242395 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:49.242370 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4a876a88-f48c-4f22-83fa-9e878cf5029d-dbus\") pod \"global-pull-secret-syncer-t7x4d\" (UID: \"4a876a88-f48c-4f22-83fa-9e878cf5029d\") " pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:49.242591 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:49.242414 2538 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:49.242591 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:49.242469 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret podName:4a876a88-f48c-4f22-83fa-9e878cf5029d nodeName:}" failed. No retries permitted until 2026-04-20 14:53:49.74245033 +0000 UTC m=+19.561075852 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret") pod "global-pull-secret-syncer-t7x4d" (UID: "4a876a88-f48c-4f22-83fa-9e878cf5029d") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:49.745904 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:49.745871 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret\") pod \"global-pull-secret-syncer-t7x4d\" (UID: \"4a876a88-f48c-4f22-83fa-9e878cf5029d\") " pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:49.746274 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:49.746007 2538 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:49.746274 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:49.746067 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret podName:4a876a88-f48c-4f22-83fa-9e878cf5029d nodeName:}" failed. No retries permitted until 2026-04-20 14:53:50.746051808 +0000 UTC m=+20.564677332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret") pod "global-pull-secret-syncer-t7x4d" (UID: "4a876a88-f48c-4f22-83fa-9e878cf5029d") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:49.956119 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:53:49.956093 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d674b0f_26a1_44f7_8346_4ad4d666371e.slice/crio-77f9cddf4c4a8646b2e717f01a97c79046a3f0d86ed5db8d1d9cdf407176b898 WatchSource:0}: Error finding container 77f9cddf4c4a8646b2e717f01a97c79046a3f0d86ed5db8d1d9cdf407176b898: Status 404 returned error can't find the container with id 77f9cddf4c4a8646b2e717f01a97c79046a3f0d86ed5db8d1d9cdf407176b898 Apr 20 14:53:50.706643 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.706444 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:50.706767 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.706492 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:50.706767 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:50.706723 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:50.706887 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.706508 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:50.706887 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:50.706788 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t7x4d" podUID="4a876a88-f48c-4f22-83fa-9e878cf5029d" Apr 20 14:53:50.706996 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:50.706883 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:53:50.752852 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.752813 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret\") pod \"global-pull-secret-syncer-t7x4d\" (UID: \"4a876a88-f48c-4f22-83fa-9e878cf5029d\") " pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:50.753568 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:50.752994 2538 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:50.753568 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:50.753071 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret podName:4a876a88-f48c-4f22-83fa-9e878cf5029d nodeName:}" failed. No retries permitted until 2026-04-20 14:53:52.753048362 +0000 UTC m=+22.571673886 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret") pod "global-pull-secret-syncer-t7x4d" (UID: "4a876a88-f48c-4f22-83fa-9e878cf5029d") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:50.788031 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.787941 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-777tg" event={"ID":"fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d","Type":"ContainerStarted","Data":"3b5822448971c324c6eee1724f904d00fca4797437c01bc7af1768d188ffc003"} Apr 20 14:53:50.789733 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.789705 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fz52j" event={"ID":"665d6c9a-125a-4b06-a1a5-d6b7f642e117","Type":"ContainerStarted","Data":"ce93e0963cf02fe9a0e0239f14205f3f2d6ca5c1e9338136571de8aee9c6b60b"} Apr 20 14:53:50.791272 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.791247 2538 generic.go:358] "Generic (PLEG): container finished" podID="2cb834ae-b00c-44c5-8c4b-591c1777bf5f" containerID="cf7fb7ef519a793950f62a7a4a7032f4d294d85456832ee0de55a9e84427c191" exitCode=0 Apr 20 14:53:50.791383 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.791322 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsr4z" event={"ID":"2cb834ae-b00c-44c5-8c4b-591c1777bf5f","Type":"ContainerDied","Data":"cf7fb7ef519a793950f62a7a4a7032f4d294d85456832ee0de55a9e84427c191"} Apr 20 14:53:50.794779 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.794755 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" event={"ID":"5e16698f-ad67-45e2-8b90-cd0a144a2469","Type":"ContainerStarted","Data":"66838e428820d29ff489ad545a219a35c23da2982e87652da5c03136e01d1009"} Apr 20 14:53:50.794861 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.794786 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" event={"ID":"5e16698f-ad67-45e2-8b90-cd0a144a2469","Type":"ContainerStarted","Data":"a3236fffa1be09d67707f55130915a5a9027887fac40fe0ba3d6e6131adc121f"} Apr 20 14:53:50.794861 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.794800 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" event={"ID":"5e16698f-ad67-45e2-8b90-cd0a144a2469","Type":"ContainerStarted","Data":"a3f8d01c60956632f2baa2ce1e5d1be8c5f57b7aabd80e8b4656005758205e6f"} Apr 20 14:53:50.794861 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.794813 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" event={"ID":"5e16698f-ad67-45e2-8b90-cd0a144a2469","Type":"ContainerStarted","Data":"1b3e03a7f04013d28fbf7629af60b982a65a604c276dc02403b6834f0a35c2be"} Apr 20 14:53:50.794861 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.794825 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" event={"ID":"5e16698f-ad67-45e2-8b90-cd0a144a2469","Type":"ContainerStarted","Data":"c2e9dc124aac559a8597f2f3c8a83f7adf407e723da0bd147b9fb321afe92d6f"} Apr 20 14:53:50.794861 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.794837 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" 
event={"ID":"5e16698f-ad67-45e2-8b90-cd0a144a2469","Type":"ContainerStarted","Data":"edd80b7b7acebc878776f5b2aff3c50ff5921e1e296ed39338248e8b346d0cad"} Apr 20 14:53:50.796206 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.796183 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mw2xg" event={"ID":"54adba6d-382e-43b7-9219-644ce4ea5f46","Type":"ContainerStarted","Data":"ceae5a79da1576f5ead144f7f039e3d59675d21485e985cd5c28b558f509d787"} Apr 20 14:53:50.797901 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.797882 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" event={"ID":"cf488808-137d-4895-9f36-c87fdbd47441","Type":"ContainerStarted","Data":"5854ff860f6f60c2f727c655d8ba4a075bd6cc957eb6e49b13505388c7190560"} Apr 20 14:53:50.799651 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.799633 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-64772" event={"ID":"2d674b0f-26a1-44f7-8346-4ad4d666371e","Type":"ContainerStarted","Data":"b0fe27a094088a0ce43d64afac2b79677928e0ee23cec19c9c0cd2c58e34a8d0"} Apr 20 14:53:50.799740 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.799657 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-64772" event={"ID":"2d674b0f-26a1-44f7-8346-4ad4d666371e","Type":"ContainerStarted","Data":"77f9cddf4c4a8646b2e717f01a97c79046a3f0d86ed5db8d1d9cdf407176b898"} Apr 20 14:53:50.800992 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.800968 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xt2lb" event={"ID":"1252f9bc-f9c8-4a62-8bbf-b5e145f0e656","Type":"ContainerStarted","Data":"d350e10911eef2e6791396038b7bd176925d6749e2161f000e8fd402f851bebc"} Apr 20 14:53:50.804384 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.804322 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-777tg" podStartSLOduration=4.159990589 podStartE2EDuration="20.80430918s" podCreationTimestamp="2026-04-20 14:53:30 +0000 UTC" firstStartedPulling="2026-04-20 14:53:33.311012569 +0000 UTC m=+3.129638104" lastFinishedPulling="2026-04-20 14:53:49.955331174 +0000 UTC m=+19.773956695" observedRunningTime="2026-04-20 14:53:50.803588421 +0000 UTC m=+20.622213955" watchObservedRunningTime="2026-04-20 14:53:50.80430918 +0000 UTC m=+20.622934723" Apr 20 14:53:50.817462 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.817419 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-64772" podStartSLOduration=12.817407167 podStartE2EDuration="12.817407167s" podCreationTimestamp="2026-04-20 14:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:53:50.816568278 +0000 UTC m=+20.635193822" watchObservedRunningTime="2026-04-20 14:53:50.817407167 +0000 UTC m=+20.636032712" Apr 20 14:53:50.855645 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.855606 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-fz52j" podStartSLOduration=4.177142321 podStartE2EDuration="20.855592842s" podCreationTimestamp="2026-04-20 14:53:30 +0000 UTC" firstStartedPulling="2026-04-20 14:53:33.31444356 +0000 UTC m=+3.133069081" lastFinishedPulling="2026-04-20 14:53:49.992894078 +0000 UTC m=+19.811519602" 
observedRunningTime="2026-04-20 14:53:50.839812066 +0000 UTC m=+20.658437610" watchObservedRunningTime="2026-04-20 14:53:50.855592842 +0000 UTC m=+20.674218385" Apr 20 14:53:50.856097 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.856063 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mw2xg" podStartSLOduration=4.030398231 podStartE2EDuration="20.856053421s" podCreationTimestamp="2026-04-20 14:53:30 +0000 UTC" firstStartedPulling="2026-04-20 14:53:33.304471052 +0000 UTC m=+3.123096573" lastFinishedPulling="2026-04-20 14:53:50.130126238 +0000 UTC m=+19.948751763" observedRunningTime="2026-04-20 14:53:50.854650756 +0000 UTC m=+20.673276299" watchObservedRunningTime="2026-04-20 14:53:50.856053421 +0000 UTC m=+20.674678966" Apr 20 14:53:50.870387 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:50.870324 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xt2lb" podStartSLOduration=12.079757376 podStartE2EDuration="20.870314111s" podCreationTimestamp="2026-04-20 14:53:30 +0000 UTC" firstStartedPulling="2026-04-20 14:53:33.314775366 +0000 UTC m=+3.133400887" lastFinishedPulling="2026-04-20 14:53:42.105332086 +0000 UTC m=+11.923957622" observedRunningTime="2026-04-20 14:53:50.86982372 +0000 UTC m=+20.688449266" watchObservedRunningTime="2026-04-20 14:53:50.870314111 +0000 UTC m=+20.688939655" Apr 20 14:53:51.298075 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:51.298049 2538 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 14:53:51.675596 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:51.675493 2538 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T14:53:51.29807074Z","UUID":"b880a61f-f07c-4f34-a476-f1b8201c3d8b","Handler":null,"Name":"","Endpoint":""} Apr 20 14:53:51.678202 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:51.678175 2538 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 14:53:51.678360 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:51.678213 2538 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 14:53:51.806558 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:51.805288 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" event={"ID":"cf488808-137d-4895-9f36-c87fdbd47441","Type":"ContainerStarted","Data":"0a31d969f786c7151b57273138c648ec6cc5cca87e98d7447d2cae9055169a55"} Apr 20 14:53:51.807613 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:51.807580 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dwkwl" event={"ID":"5e72864c-5e7b-4f27-a09b-1c22c0833c25","Type":"ContainerStarted","Data":"4b49a21f9bd5191d604b3a8365742470c64b3f243c29b79a50e32a3e6e19e6e4"} Apr 20 14:53:51.820701 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:51.820650 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-dwkwl" podStartSLOduration=5.215161604 podStartE2EDuration="21.820635195s" podCreationTimestamp="2026-04-20 14:53:30 +0000 
UTC" firstStartedPulling="2026-04-20 14:53:33.31572244 +0000 UTC m=+3.134347961" lastFinishedPulling="2026-04-20 14:53:49.921196032 +0000 UTC m=+19.739821552" observedRunningTime="2026-04-20 14:53:51.82048083 +0000 UTC m=+21.639106372" watchObservedRunningTime="2026-04-20 14:53:51.820635195 +0000 UTC m=+21.639260741" Apr 20 14:53:52.701861 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:52.701645 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:52.702087 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:52.701644 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:52.702087 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:52.701960 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:52.702087 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:52.701710 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:52.702087 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:52.702041 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t7x4d" podUID="4a876a88-f48c-4f22-83fa-9e878cf5029d" Apr 20 14:53:52.702310 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:52.702150 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:53:52.768081 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:52.768046 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret\") pod \"global-pull-secret-syncer-t7x4d\" (UID: \"4a876a88-f48c-4f22-83fa-9e878cf5029d\") " pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:52.768225 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:52.768185 2538 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:52.768291 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:52.768248 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret podName:4a876a88-f48c-4f22-83fa-9e878cf5029d nodeName:}" failed. No retries permitted until 2026-04-20 14:53:56.768230516 +0000 UTC m=+26.586856042 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret") pod "global-pull-secret-syncer-t7x4d" (UID: "4a876a88-f48c-4f22-83fa-9e878cf5029d") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:52.814752 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:52.814721 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" event={"ID":"cf488808-137d-4895-9f36-c87fdbd47441","Type":"ContainerStarted","Data":"fde4dfccbc6b7dddc90e3b1674aecb8eaf0422ef30953e2f3db348f6bad91364"} Apr 20 14:53:53.820248 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:53.820213 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" event={"ID":"5e16698f-ad67-45e2-8b90-cd0a144a2469","Type":"ContainerStarted","Data":"988aeb2f5aede0b273ebeff81c291af177659951edf2f2b847aed286f84bdf56"} Apr 20 14:53:54.216249 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:54.216159 2538 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-777tg" Apr 20 14:53:54.216826 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:54.216807 2538 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-777tg" Apr 20 14:53:54.233831 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:54.233786 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8qt82" podStartSLOduration=5.484296921 podStartE2EDuration="24.233770683s" podCreationTimestamp="2026-04-20 14:53:30 +0000 UTC" firstStartedPulling="2026-04-20 14:53:33.30369951 +0000 UTC m=+3.122325034" lastFinishedPulling="2026-04-20 14:53:52.053173272 +0000 UTC m=+21.871798796" observedRunningTime="2026-04-20 14:53:52.836064894 +0000 UTC m=+22.654690437" watchObservedRunningTime="2026-04-20 14:53:54.233770683 +0000 UTC m=+24.052396226" Apr 20 14:53:54.701950 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:54.701915 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:54.701950 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:54.701943 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:54.702188 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:54.701945 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:54.702188 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:54.702063 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:54.702275 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:54.702182 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:53:54.702359 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:54.702313 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t7x4d" podUID="4a876a88-f48c-4f22-83fa-9e878cf5029d" Apr 20 14:53:54.821883 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:54.821847 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-777tg" Apr 20 14:53:54.822411 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:54.822393 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-777tg" Apr 20 14:53:55.824864 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:55.824827 2538 generic.go:358] "Generic (PLEG): container finished" podID="2cb834ae-b00c-44c5-8c4b-591c1777bf5f" containerID="c60ce976d4f598b2b917f3f979ea5d11aa226a324bc3fa877358cda8ebc6e6f1" exitCode=0 Apr 20 14:53:55.825253 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:55.824915 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsr4z" event={"ID":"2cb834ae-b00c-44c5-8c4b-591c1777bf5f","Type":"ContainerDied","Data":"c60ce976d4f598b2b917f3f979ea5d11aa226a324bc3fa877358cda8ebc6e6f1"} Apr 20 14:53:55.828196 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:55.828133 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" event={"ID":"5e16698f-ad67-45e2-8b90-cd0a144a2469","Type":"ContainerStarted","Data":"3351ed27e3c352006fd2010a10e0566fe9f3ef3263495f749a693912179205a4"} Apr 20 14:53:55.828503 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:55.828485 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:55.828578 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:55.828510 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:55.842601 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:55.842579 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:56.701303 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:56.701272 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:56.701508 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:56.701272 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:56.701508 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:56.701397 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:56.701508 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:56.701272 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:56.701508 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:56.701477 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:53:56.701674 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:56.701538 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t7x4d" podUID="4a876a88-f48c-4f22-83fa-9e878cf5029d" Apr 20 14:53:56.795019 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:56.794985 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret\") pod \"global-pull-secret-syncer-t7x4d\" (UID: \"4a876a88-f48c-4f22-83fa-9e878cf5029d\") " pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:56.795178 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:56.795119 2538 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:56.795178 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:56.795176 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret podName:4a876a88-f48c-4f22-83fa-9e878cf5029d nodeName:}" failed. No retries permitted until 2026-04-20 14:54:04.795159645 +0000 UTC m=+34.613785177 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret") pod "global-pull-secret-syncer-t7x4d" (UID: "4a876a88-f48c-4f22-83fa-9e878cf5029d") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:56.830683 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:56.830659 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:56.845195 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:56.845163 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:53:56.887234 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:56.887165 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" podStartSLOduration=10.148432785 podStartE2EDuration="26.887146946s" podCreationTimestamp="2026-04-20 14:53:30 +0000 UTC" firstStartedPulling="2026-04-20 14:53:33.306250748 +0000 UTC m=+3.124876269" lastFinishedPulling="2026-04-20 14:53:50.044964896 +0000 UTC m=+19.863590430" observedRunningTime="2026-04-20 14:53:55.908033517 +0000 UTC m=+25.726659061" watchObservedRunningTime="2026-04-20 14:53:56.887146946 +0000 UTC m=+26.705772492" Apr 20 14:53:57.195206 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:57.195161 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fdmrj"] Apr 20 14:53:57.195397 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:57.195331 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:57.195458 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:57.195442 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:57.196138 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:57.196087 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rdpq8"] Apr 20 14:53:57.196266 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:57.196212 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:57.196334 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:57.196303 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:53:57.196765 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:57.196742 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-t7x4d"] Apr 20 14:53:57.196875 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:57.196835 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:57.196945 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:57.196928 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t7x4d" podUID="4a876a88-f48c-4f22-83fa-9e878cf5029d" Apr 20 14:53:57.834071 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:57.834036 2538 generic.go:358] "Generic (PLEG): container finished" podID="2cb834ae-b00c-44c5-8c4b-591c1777bf5f" containerID="b5e0bcec9fc593ce34243c58d8bbdb99262b5c40561d16f038d250a6627d5b3b" exitCode=0 Apr 20 14:53:57.834662 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:57.834118 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsr4z" event={"ID":"2cb834ae-b00c-44c5-8c4b-591c1777bf5f","Type":"ContainerDied","Data":"b5e0bcec9fc593ce34243c58d8bbdb99262b5c40561d16f038d250a6627d5b3b"} Apr 20 14:53:58.701863 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:58.701829 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:53:58.701987 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:58.701836 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:53:58.701987 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:58.701939 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:53:58.702065 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:58.701836 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:53:58.702065 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:58.702021 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t7x4d" podUID="4a876a88-f48c-4f22-83fa-9e878cf5029d" Apr 20 14:53:58.702129 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:53:58.702079 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:53:58.837669 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:58.837636 2538 generic.go:358] "Generic (PLEG): container finished" podID="2cb834ae-b00c-44c5-8c4b-591c1777bf5f" containerID="6d362498bfd2249f170dbae59d7f21219501ec1e559f64b24bb11ae1f0f24205" exitCode=0 Apr 20 14:53:58.838094 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:53:58.837721 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsr4z" event={"ID":"2cb834ae-b00c-44c5-8c4b-591c1777bf5f","Type":"ContainerDied","Data":"6d362498bfd2249f170dbae59d7f21219501ec1e559f64b24bb11ae1f0f24205"} Apr 20 14:54:00.701994 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:00.701800 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:54:00.702513 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:00.702075 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t7x4d" podUID="4a876a88-f48c-4f22-83fa-9e878cf5029d" Apr 20 14:54:00.702513 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:00.701892 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:54:00.702513 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:00.702163 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:54:00.702513 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:00.701872 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:54:00.702513 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:00.702250 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:54:02.700970 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:02.700934 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:54:02.701466 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:02.700935 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:54:02.701466 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:02.701060 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t7x4d" podUID="4a876a88-f48c-4f22-83fa-9e878cf5029d" Apr 20 14:54:02.701466 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:02.701139 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdpq8" podUID="70b39118-4141-405c-9b0c-b59eec25451c" Apr 20 14:54:02.701466 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:02.700940 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:54:02.701466 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:02.701258 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3" Apr 20 14:54:04.008301 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.008223 2538 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeReady" Apr 20 14:54:04.008841 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.008408 2538 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 14:54:04.049637 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.049603 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-v5pr2"] Apr 20 14:54:04.085365 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.085316 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kw4l7"] Apr 20 14:54:04.085528 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.085519 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:04.088502 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.088434 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 14:54:04.088502 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.088448 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 14:54:04.088502 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.088468 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rc7ph\"" Apr 20 14:54:04.101206 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.100949 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v5pr2"] Apr 20 14:54:04.101206 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.100984 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kw4l7"] Apr 20 14:54:04.101206 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.101095 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kw4l7" Apr 20 14:54:04.103705 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.103683 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 14:54:04.103822 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.103743 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jp5zp\"" Apr 20 14:54:04.103822 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.103788 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 14:54:04.103943 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.103743 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 14:54:04.151774 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.151733 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-config-volume\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:04.151937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.151802 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-tmp-dir\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:04.151937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.151837 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:04.151937 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.151866 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfkdr\" (UniqueName: 
\"kubernetes.io/projected/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-kube-api-access-pfkdr\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:04.252869 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.252831 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-config-volume\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:04.253043 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.252902 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-tmp-dir\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:04.253043 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.252936 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krfdf\" (UniqueName: \"kubernetes.io/projected/50a05c11-ab26-4873-920e-803fdbe14912-kube-api-access-krfdf\") pod \"ingress-canary-kw4l7\" (UID: \"50a05c11-ab26-4873-920e-803fdbe14912\") " pod="openshift-ingress-canary/ingress-canary-kw4l7" Apr 20 14:54:04.253043 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.252977 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:04.253043 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.253005 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfkdr\" (UniqueName: \"kubernetes.io/projected/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-kube-api-access-pfkdr\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:04.253255 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.253124 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert\") pod \"ingress-canary-kw4l7\" (UID: \"50a05c11-ab26-4873-920e-803fdbe14912\") " pod="openshift-ingress-canary/ingress-canary-kw4l7" Apr 20 14:54:04.253255 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:04.253130 2538 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:04.253368 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:04.253271 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls podName:3dfaac5e-6a5e-40e4-81fa-19962c0d578f nodeName:}" failed. No retries permitted until 2026-04-20 14:54:04.753247528 +0000 UTC m=+34.571873074 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls") pod "dns-default-v5pr2" (UID: "3dfaac5e-6a5e-40e4-81fa-19962c0d578f") : secret "dns-default-metrics-tls" not found Apr 20 14:54:04.253368 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.253271 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-tmp-dir\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:04.253562 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.253532 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-config-volume\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:04.264636 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.264556 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfkdr\" (UniqueName: \"kubernetes.io/projected/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-kube-api-access-pfkdr\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:04.353538 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.353502 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert\") pod \"ingress-canary-kw4l7\" (UID: \"50a05c11-ab26-4873-920e-803fdbe14912\") " pod="openshift-ingress-canary/ingress-canary-kw4l7" Apr 20 14:54:04.353720 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.353571 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krfdf\" (UniqueName: \"kubernetes.io/projected/50a05c11-ab26-4873-920e-803fdbe14912-kube-api-access-krfdf\") pod \"ingress-canary-kw4l7\" (UID: \"50a05c11-ab26-4873-920e-803fdbe14912\") " pod="openshift-ingress-canary/ingress-canary-kw4l7" Apr 20 14:54:04.353720 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:04.353663 2538 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:04.353833 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:04.353737 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert podName:50a05c11-ab26-4873-920e-803fdbe14912 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:04.853716471 +0000 UTC m=+34.672342008 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert") pod "ingress-canary-kw4l7" (UID: "50a05c11-ab26-4873-920e-803fdbe14912") : secret "canary-serving-cert" not found Apr 20 14:54:04.362258 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.362229 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krfdf\" (UniqueName: \"kubernetes.io/projected/50a05c11-ab26-4873-920e-803fdbe14912-kube-api-access-krfdf\") pod \"ingress-canary-kw4l7\" (UID: \"50a05c11-ab26-4873-920e-803fdbe14912\") " pod="openshift-ingress-canary/ingress-canary-kw4l7" Apr 20 14:54:04.454584 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.454548 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:54:04.454729 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:04.454706 2538 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:54:04.454729 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.454717 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frdcz\" (UniqueName: \"kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz\") pod \"network-check-target-rdpq8\" (UID: \"70b39118-4141-405c-9b0c-b59eec25451c\") " pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:54:04.454796 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:04.454765 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs podName:5752b3ca-4688-4db8-9995-af78bc6f30d3 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:36.454751195 +0000 UTC m=+66.273376715 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs") pod "network-metrics-daemon-fdmrj" (UID: "5752b3ca-4688-4db8-9995-af78bc6f30d3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:54:04.454841 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:04.454831 2538 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:54:04.454873 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:04.454844 2538 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:54:04.454873 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:04.454854 2538 projected.go:194] Error preparing data for projected volume kube-api-access-frdcz for pod openshift-network-diagnostics/network-check-target-rdpq8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:54:04.454932 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:04.454905 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz podName:70b39118-4141-405c-9b0c-b59eec25451c nodeName:}" failed. No retries permitted until 2026-04-20 14:54:36.454892926 +0000 UTC m=+66.273518446 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-frdcz" (UniqueName: "kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz") pod "network-check-target-rdpq8" (UID: "70b39118-4141-405c-9b0c-b59eec25451c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:54:04.701544 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.701511 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:54:04.701729 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.701511 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:54:04.701729 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.701511 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:54:04.705300 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.704883 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:54:04.705300 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.704984 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 14:54:04.705300 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.705156 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-np6cl\"" Apr 20 14:54:04.707136 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.705635 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 14:54:04.707136 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.705873 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 14:54:04.707136 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.706074 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ns2f2\"" Apr 20 14:54:04.757176 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.757142 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:04.757292 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:04.757276 2538 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:04.757362 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:04.757335 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls podName:3dfaac5e-6a5e-40e4-81fa-19962c0d578f nodeName:}" failed. No retries permitted until 2026-04-20 14:54:05.757321904 +0000 UTC m=+35.575947425 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls") pod "dns-default-v5pr2" (UID: "3dfaac5e-6a5e-40e4-81fa-19962c0d578f") : secret "dns-default-metrics-tls" not found Apr 20 14:54:04.858333 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.858309 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret\") pod \"global-pull-secret-syncer-t7x4d\" (UID: \"4a876a88-f48c-4f22-83fa-9e878cf5029d\") " pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:54:04.858458 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.858356 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert\") pod \"ingress-canary-kw4l7\" (UID: \"50a05c11-ab26-4873-920e-803fdbe14912\") " pod="openshift-ingress-canary/ingress-canary-kw4l7" Apr 20 14:54:04.858519 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:04.858457 2538 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:04.858519 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:04.858515 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert podName:50a05c11-ab26-4873-920e-803fdbe14912 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:05.858500166 +0000 UTC m=+35.677125690 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert") pod "ingress-canary-kw4l7" (UID: "50a05c11-ab26-4873-920e-803fdbe14912") : secret "canary-serving-cert" not found Apr 20 14:54:04.860500 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:04.860481 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4a876a88-f48c-4f22-83fa-9e878cf5029d-original-pull-secret\") pod \"global-pull-secret-syncer-t7x4d\" (UID: \"4a876a88-f48c-4f22-83fa-9e878cf5029d\") " pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:54:05.019900 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:05.019874 2538 util.go:30] "No sandbox for pod can be found. 
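[Note on the kube-api-access-frdcz failure above: a kube-api-access volume is a projected volume the kubelet assembles from a bound service-account token, the kube-root-ca.crt ConfigMap, the pod's namespace, and, on OpenShift, the openshift-service-ca.crt ConfigMap. That is why the mount cannot proceed until both ConfigMap objects are registered with the kubelet's per-object reflectors (the "Caches populated" lines). A minimal sketch of that volume's shape using the Kubernetes API types; the source list is inferred from the two "not registered" errors above, and the token expiry is a hypothetical placeholder:]

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        expiry := int64(3607) // hypothetical placeholder; the log does not show this value
        vol := corev1.Volume{
            Name: "kube-api-access-frdcz", // name taken from the log
            VolumeSource: corev1.VolumeSource{
                Projected: &corev1.ProjectedVolumeSource{
                    Sources: []corev1.VolumeProjection{
                        // Bound service-account token, rotated by the kubelet.
                        {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{Path: "token", ExpirationSeconds: &expiry}},
                        // The two ConfigMaps the errors above report as "not registered".
                        {ConfigMap: &corev1.ConfigMapProjection{LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"}}},
                        {ConfigMap: &corev1.ConfigMapProjection{LocalObjectReference: corev1.LocalObjectReference{Name: "openshift-service-ca.crt"}}},
                        // The pod's namespace via the downward API.
                        {DownwardAPI: &corev1.DownwardAPIProjection{Items: []corev1.DownwardAPIVolumeFile{
                            {Path: "namespace", FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"}},
                        }}},
                    },
                },
            },
        }
        fmt.Println(vol.Name)
    }

[Once the reflector cache for each referenced object syncs, the retry succeeds: MountVolume.SetUp for kube-api-access-frdcz completes at 14:54:36 further down.]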
Need to start a new one" pod="kube-system/global-pull-secret-syncer-t7x4d" Apr 20 14:54:05.174390 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:05.174324 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-t7x4d"] Apr 20 14:54:05.182397 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:54:05.182363 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a876a88_f48c_4f22_83fa_9e878cf5029d.slice/crio-92db28210fe839ffea441b15c1b187bbe01c4348590e63f664c687720c866b14 WatchSource:0}: Error finding container 92db28210fe839ffea441b15c1b187bbe01c4348590e63f664c687720c866b14: Status 404 returned error can't find the container with id 92db28210fe839ffea441b15c1b187bbe01c4348590e63f664c687720c866b14 Apr 20 14:54:05.765055 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:05.765016 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:05.765248 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:05.765177 2538 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:05.765248 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:05.765241 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls podName:3dfaac5e-6a5e-40e4-81fa-19962c0d578f nodeName:}" failed. No retries permitted until 2026-04-20 14:54:07.765226008 +0000 UTC m=+37.583851529 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls") pod "dns-default-v5pr2" (UID: "3dfaac5e-6a5e-40e4-81fa-19962c0d578f") : secret "dns-default-metrics-tls" not found Apr 20 14:54:05.855615 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:05.855579 2538 generic.go:358] "Generic (PLEG): container finished" podID="2cb834ae-b00c-44c5-8c4b-591c1777bf5f" containerID="f8cd30a6c5b3e9b9493fde6a577c132574ec4ee863a622ccc1077f68fa6fb946" exitCode=0 Apr 20 14:54:05.855778 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:05.855664 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsr4z" event={"ID":"2cb834ae-b00c-44c5-8c4b-591c1777bf5f","Type":"ContainerDied","Data":"f8cd30a6c5b3e9b9493fde6a577c132574ec4ee863a622ccc1077f68fa6fb946"} Apr 20 14:54:05.856729 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:05.856706 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-t7x4d" event={"ID":"4a876a88-f48c-4f22-83fa-9e878cf5029d","Type":"ContainerStarted","Data":"92db28210fe839ffea441b15c1b187bbe01c4348590e63f664c687720c866b14"} Apr 20 14:54:05.865923 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:05.865902 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert\") pod \"ingress-canary-kw4l7\" (UID: \"50a05c11-ab26-4873-920e-803fdbe14912\") " pod="openshift-ingress-canary/ingress-canary-kw4l7" Apr 20 14:54:05.866041 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:05.866018 2538 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:05.866097 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:05.866082 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert podName:50a05c11-ab26-4873-920e-803fdbe14912 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:07.866062306 +0000 UTC m=+37.684687839 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert") pod "ingress-canary-kw4l7" (UID: "50a05c11-ab26-4873-920e-803fdbe14912") : secret "canary-serving-cert" not found Apr 20 14:54:06.862104 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:06.862069 2538 generic.go:358] "Generic (PLEG): container finished" podID="2cb834ae-b00c-44c5-8c4b-591c1777bf5f" containerID="0e6501303a64e30b6f409ece1ed655de4ef88d6d76c3c0cc07576fc9f6e9354f" exitCode=0 Apr 20 14:54:06.862104 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:06.862110 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsr4z" event={"ID":"2cb834ae-b00c-44c5-8c4b-591c1777bf5f","Type":"ContainerDied","Data":"0e6501303a64e30b6f409ece1ed655de4ef88d6d76c3c0cc07576fc9f6e9354f"} Apr 20 14:54:07.777633 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:07.777598 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:07.777834 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:07.777775 2538 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:07.777951 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:07.777852 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls podName:3dfaac5e-6a5e-40e4-81fa-19962c0d578f nodeName:}" failed. No retries permitted until 2026-04-20 14:54:11.777833082 +0000 UTC m=+41.596458603 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls") pod "dns-default-v5pr2" (UID: "3dfaac5e-6a5e-40e4-81fa-19962c0d578f") : secret "dns-default-metrics-tls" not found Apr 20 14:54:07.870943 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:07.870891 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsr4z" event={"ID":"2cb834ae-b00c-44c5-8c4b-591c1777bf5f","Type":"ContainerStarted","Data":"51a7571f60868cbfae1b3363f74f5bf0c52f4651a6710bbe50787717704542e8"} Apr 20 14:54:07.878530 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:07.878497 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert\") pod \"ingress-canary-kw4l7\" (UID: \"50a05c11-ab26-4873-920e-803fdbe14912\") " pod="openshift-ingress-canary/ingress-canary-kw4l7" Apr 20 14:54:07.878680 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:07.878640 2538 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:07.878739 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:07.878716 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert podName:50a05c11-ab26-4873-920e-803fdbe14912 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:11.878696074 +0000 UTC m=+41.697321600 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert") pod "ingress-canary-kw4l7" (UID: "50a05c11-ab26-4873-920e-803fdbe14912") : secret "canary-serving-cert" not found Apr 20 14:54:07.895685 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:07.895619 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fsr4z" podStartSLOduration=6.360137135 podStartE2EDuration="37.8956001s" podCreationTimestamp="2026-04-20 14:53:30 +0000 UTC" firstStartedPulling="2026-04-20 14:53:33.312248697 +0000 UTC m=+3.130874218" lastFinishedPulling="2026-04-20 14:54:04.847711663 +0000 UTC m=+34.666337183" observedRunningTime="2026-04-20 14:54:07.895077445 +0000 UTC m=+37.713702989" watchObservedRunningTime="2026-04-20 14:54:07.8956001 +0000 UTC m=+37.714225647" Apr 20 14:54:10.877818 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:10.877783 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-t7x4d" event={"ID":"4a876a88-f48c-4f22-83fa-9e878cf5029d","Type":"ContainerStarted","Data":"de5485b1e66f0014aa3c6f9b48a99f43eef1070b73b9ca10287d6a5f6637a48e"} Apr 20 14:54:10.892306 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:10.892261 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-t7x4d" podStartSLOduration=16.910176704 podStartE2EDuration="21.89224811s" podCreationTimestamp="2026-04-20 14:53:49 +0000 UTC" firstStartedPulling="2026-04-20 14:54:05.184514272 +0000 UTC m=+35.003139793" lastFinishedPulling="2026-04-20 14:54:10.166585676 +0000 UTC m=+39.985211199" observedRunningTime="2026-04-20 14:54:10.891977078 +0000 UTC m=+40.710602618" watchObservedRunningTime="2026-04-20 14:54:10.89224811 +0000 UTC m=+40.710873663" Apr 20 14:54:11.330459 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.330426 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh"] Apr 20 14:54:11.360871 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.360836 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx"] Apr 20 14:54:11.361023 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.360963 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh" Apr 20 14:54:11.363995 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.363962 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 14:54:11.364129 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.364091 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 14:54:11.364239 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.364159 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 14:54:11.364239 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.364166 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-cjb9b\"" Apr 20 14:54:11.364442 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.364427 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 14:54:11.375656 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.375630 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x"] Apr 20 14:54:11.375952 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.375931 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" Apr 20 14:54:11.378618 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.378599 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 14:54:11.399861 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.399840 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh"] Apr 20 14:54:11.399861 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.399863 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx"] Apr 20 14:54:11.400013 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.399874 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x"] Apr 20 14:54:11.400013 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.399960 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.402687 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.402667 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 14:54:11.402797 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.402706 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 14:54:11.402797 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.402762 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 14:54:11.402911 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.402794 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 14:54:11.506270 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.506230 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9a1565e7-1ef9-4f8a-8bd8-d45090a0e622-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bc94964d5-tp4rx\" (UID: \"9a1565e7-1ef9-4f8a-8bd8-d45090a0e622\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" Apr 20 14:54:11.506449 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.506287 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5d56359b-6635-40d6-beac-7e5a10cd4de1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh\" (UID: \"5d56359b-6635-40d6-beac-7e5a10cd4de1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh" Apr 20 14:54:11.506449 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.506320 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrgbf\" (UniqueName: \"kubernetes.io/projected/5d56359b-6635-40d6-beac-7e5a10cd4de1-kube-api-access-zrgbf\") pod \"managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh\" (UID: \"5d56359b-6635-40d6-beac-7e5a10cd4de1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh" Apr 20 14:54:11.506449 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.506373 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5927be3a-bd47-4737-bf39-2788500862ce-ca\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.506449 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.506412 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5927be3a-bd47-4737-bf39-2788500862ce-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.506449 
ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.506435 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5927be3a-bd47-4737-bf39-2788500862ce-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.506647 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.506460 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5927be3a-bd47-4737-bf39-2788500862ce-hub\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.506647 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.506475 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a1565e7-1ef9-4f8a-8bd8-d45090a0e622-tmp\") pod \"klusterlet-addon-workmgr-7bc94964d5-tp4rx\" (UID: \"9a1565e7-1ef9-4f8a-8bd8-d45090a0e622\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" Apr 20 14:54:11.506647 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.506491 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t6pm\" (UniqueName: \"kubernetes.io/projected/9a1565e7-1ef9-4f8a-8bd8-d45090a0e622-kube-api-access-9t6pm\") pod \"klusterlet-addon-workmgr-7bc94964d5-tp4rx\" (UID: \"9a1565e7-1ef9-4f8a-8bd8-d45090a0e622\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" Apr 20 14:54:11.506647 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.506526 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5927be3a-bd47-4737-bf39-2788500862ce-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.506647 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.506553 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsgj4\" (UniqueName: \"kubernetes.io/projected/5927be3a-bd47-4737-bf39-2788500862ce-kube-api-access-rsgj4\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.607432 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.607355 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9a1565e7-1ef9-4f8a-8bd8-d45090a0e622-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bc94964d5-tp4rx\" (UID: \"9a1565e7-1ef9-4f8a-8bd8-d45090a0e622\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" Apr 20 14:54:11.607432 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.607408 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/5d56359b-6635-40d6-beac-7e5a10cd4de1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh\" (UID: \"5d56359b-6635-40d6-beac-7e5a10cd4de1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh" Apr 20 14:54:11.607620 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.607436 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrgbf\" (UniqueName: \"kubernetes.io/projected/5d56359b-6635-40d6-beac-7e5a10cd4de1-kube-api-access-zrgbf\") pod \"managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh\" (UID: \"5d56359b-6635-40d6-beac-7e5a10cd4de1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh" Apr 20 14:54:11.607620 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.607463 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5927be3a-bd47-4737-bf39-2788500862ce-ca\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.607620 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.607508 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5927be3a-bd47-4737-bf39-2788500862ce-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.607620 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.607543 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5927be3a-bd47-4737-bf39-2788500862ce-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.607620 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.607568 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5927be3a-bd47-4737-bf39-2788500862ce-hub\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.607620 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.607590 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a1565e7-1ef9-4f8a-8bd8-d45090a0e622-tmp\") pod \"klusterlet-addon-workmgr-7bc94964d5-tp4rx\" (UID: \"9a1565e7-1ef9-4f8a-8bd8-d45090a0e622\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" Apr 20 14:54:11.607620 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.607613 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9t6pm\" (UniqueName: \"kubernetes.io/projected/9a1565e7-1ef9-4f8a-8bd8-d45090a0e622-kube-api-access-9t6pm\") pod \"klusterlet-addon-workmgr-7bc94964d5-tp4rx\" (UID: \"9a1565e7-1ef9-4f8a-8bd8-d45090a0e622\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" Apr 20 14:54:11.607947 ip-10-0-142-255 
kubenswrapper[2538]: I0420 14:54:11.607901 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5927be3a-bd47-4737-bf39-2788500862ce-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.607994 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.607948 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsgj4\" (UniqueName: \"kubernetes.io/projected/5927be3a-bd47-4737-bf39-2788500862ce-kube-api-access-rsgj4\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.607994 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.607971 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a1565e7-1ef9-4f8a-8bd8-d45090a0e622-tmp\") pod \"klusterlet-addon-workmgr-7bc94964d5-tp4rx\" (UID: \"9a1565e7-1ef9-4f8a-8bd8-d45090a0e622\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" Apr 20 14:54:11.608398 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.608372 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5927be3a-bd47-4737-bf39-2788500862ce-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.611262 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.611241 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5d56359b-6635-40d6-beac-7e5a10cd4de1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh\" (UID: \"5d56359b-6635-40d6-beac-7e5a10cd4de1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh" Apr 20 14:54:11.611392 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.611285 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9a1565e7-1ef9-4f8a-8bd8-d45090a0e622-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bc94964d5-tp4rx\" (UID: \"9a1565e7-1ef9-4f8a-8bd8-d45090a0e622\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" Apr 20 14:54:11.611392 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.611300 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5927be3a-bd47-4737-bf39-2788500862ce-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.611392 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.611286 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5927be3a-bd47-4737-bf39-2788500862ce-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.611392 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.611336 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5927be3a-bd47-4737-bf39-2788500862ce-hub\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.611602 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.611565 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5927be3a-bd47-4737-bf39-2788500862ce-ca\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.616186 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.616153 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsgj4\" (UniqueName: \"kubernetes.io/projected/5927be3a-bd47-4737-bf39-2788500862ce-kube-api-access-rsgj4\") pod \"cluster-proxy-proxy-agent-76f5bb6df5-6zq9x\" (UID: \"5927be3a-bd47-4737-bf39-2788500862ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.616289 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.616252 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrgbf\" (UniqueName: \"kubernetes.io/projected/5d56359b-6635-40d6-beac-7e5a10cd4de1-kube-api-access-zrgbf\") pod \"managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh\" (UID: \"5d56359b-6635-40d6-beac-7e5a10cd4de1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh" Apr 20 14:54:11.616461 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.616443 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t6pm\" (UniqueName: \"kubernetes.io/projected/9a1565e7-1ef9-4f8a-8bd8-d45090a0e622-kube-api-access-9t6pm\") pod \"klusterlet-addon-workmgr-7bc94964d5-tp4rx\" (UID: \"9a1565e7-1ef9-4f8a-8bd8-d45090a0e622\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" Apr 20 14:54:11.684587 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.684555 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh" Apr 20 14:54:11.691474 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.691444 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" Apr 20 14:54:11.708357 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.708303 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" Apr 20 14:54:11.808939 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.808899 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:11.809165 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:11.809138 2538 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:11.809258 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:11.809222 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls podName:3dfaac5e-6a5e-40e4-81fa-19962c0d578f nodeName:}" failed. No retries permitted until 2026-04-20 14:54:19.809203193 +0000 UTC m=+49.627828715 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls") pod "dns-default-v5pr2" (UID: "3dfaac5e-6a5e-40e4-81fa-19962c0d578f") : secret "dns-default-metrics-tls" not found Apr 20 14:54:11.834955 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.834925 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx"] Apr 20 14:54:11.838636 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:54:11.838604 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a1565e7_1ef9_4f8a_8bd8_d45090a0e622.slice/crio-be104f9703016c2a4fa3a7330288ef6620ed6d244ba253b0eab8f3e9404bfcc2 WatchSource:0}: Error finding container be104f9703016c2a4fa3a7330288ef6620ed6d244ba253b0eab8f3e9404bfcc2: Status 404 returned error can't find the container with id be104f9703016c2a4fa3a7330288ef6620ed6d244ba253b0eab8f3e9404bfcc2 Apr 20 14:54:11.853322 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.853288 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh"] Apr 20 14:54:11.857415 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:54:11.857391 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d56359b_6635_40d6_beac_7e5a10cd4de1.slice/crio-7f4e308c21df57515c6bb276047f2da923dceccb3fcc9ca7b8198519b134ad87 WatchSource:0}: Error finding container 7f4e308c21df57515c6bb276047f2da923dceccb3fcc9ca7b8198519b134ad87: Status 404 returned error can't find the container with id 7f4e308c21df57515c6bb276047f2da923dceccb3fcc9ca7b8198519b134ad87 Apr 20 14:54:11.864931 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.864905 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x"] Apr 20 14:54:11.868524 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:54:11.868496 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5927be3a_bd47_4737_bf39_2788500862ce.slice/crio-90ca56a33119898be21964dfccea8f141903bec95feeb1e75329f518d5e1aa96 WatchSource:0}: Error finding container 
90ca56a33119898be21964dfccea8f141903bec95feeb1e75329f518d5e1aa96: Status 404 returned error can't find the container with id 90ca56a33119898be21964dfccea8f141903bec95feeb1e75329f518d5e1aa96 Apr 20 14:54:11.880037 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.880010 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh" event={"ID":"5d56359b-6635-40d6-beac-7e5a10cd4de1","Type":"ContainerStarted","Data":"7f4e308c21df57515c6bb276047f2da923dceccb3fcc9ca7b8198519b134ad87"} Apr 20 14:54:11.881115 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.881091 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" event={"ID":"5927be3a-bd47-4737-bf39-2788500862ce","Type":"ContainerStarted","Data":"90ca56a33119898be21964dfccea8f141903bec95feeb1e75329f518d5e1aa96"} Apr 20 14:54:11.882074 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.882046 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" event={"ID":"9a1565e7-1ef9-4f8a-8bd8-d45090a0e622","Type":"ContainerStarted","Data":"be104f9703016c2a4fa3a7330288ef6620ed6d244ba253b0eab8f3e9404bfcc2"} Apr 20 14:54:11.909723 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:11.909693 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert\") pod \"ingress-canary-kw4l7\" (UID: \"50a05c11-ab26-4873-920e-803fdbe14912\") " pod="openshift-ingress-canary/ingress-canary-kw4l7" Apr 20 14:54:11.909858 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:11.909830 2538 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:11.909910 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:11.909903 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert podName:50a05c11-ab26-4873-920e-803fdbe14912 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:19.90988738 +0000 UTC m=+49.728512906 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert") pod "ingress-canary-kw4l7" (UID: "50a05c11-ab26-4873-920e-803fdbe14912") : secret "canary-serving-cert" not found Apr 20 14:54:18.899543 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:18.899501 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" event={"ID":"5927be3a-bd47-4737-bf39-2788500862ce","Type":"ContainerStarted","Data":"e2b5265955f60c61583d89ce6537ad84f18a36ebff1bbfd332549995eda2ab87"} Apr 20 14:54:18.900924 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:18.900897 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" event={"ID":"9a1565e7-1ef9-4f8a-8bd8-d45090a0e622","Type":"ContainerStarted","Data":"2c19b93191992070d40a73afc2009015cf43bd0d772ce5f9f0a75e6ca5c55b78"} Apr 20 14:54:18.901079 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:18.901067 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" Apr 20 14:54:18.902247 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:18.902224 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh" event={"ID":"5d56359b-6635-40d6-beac-7e5a10cd4de1","Type":"ContainerStarted","Data":"9db9d8fc8c3c3ffcc006a0d0929b37c51a44072977f4ef70167c591c21e6149c"} Apr 20 14:54:18.903136 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:18.903115 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" Apr 20 14:54:18.926962 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:18.926918 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" podStartSLOduration=1.793056973 podStartE2EDuration="7.926902738s" podCreationTimestamp="2026-04-20 14:54:11 +0000 UTC" firstStartedPulling="2026-04-20 14:54:11.840564202 +0000 UTC m=+41.659189723" lastFinishedPulling="2026-04-20 14:54:17.974409952 +0000 UTC m=+47.793035488" observedRunningTime="2026-04-20 14:54:18.926698476 +0000 UTC m=+48.745324031" watchObservedRunningTime="2026-04-20 14:54:18.926902738 +0000 UTC m=+48.745528281" Apr 20 14:54:18.968234 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:18.968183 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh" podStartSLOduration=1.8680807590000001 podStartE2EDuration="7.968169509s" podCreationTimestamp="2026-04-20 14:54:11 +0000 UTC" firstStartedPulling="2026-04-20 14:54:11.85930767 +0000 UTC m=+41.677933191" lastFinishedPulling="2026-04-20 14:54:17.959396421 +0000 UTC m=+47.778021941" observedRunningTime="2026-04-20 14:54:18.968153527 +0000 UTC m=+48.786779071" watchObservedRunningTime="2026-04-20 14:54:18.968169509 +0000 UTC m=+48.786795046" Apr 20 14:54:19.873785 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:19.873735 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls\") pod \"dns-default-v5pr2\" (UID: 
\"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:19.874053 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:19.873901 2538 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:19.874053 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:19.873973 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls podName:3dfaac5e-6a5e-40e4-81fa-19962c0d578f nodeName:}" failed. No retries permitted until 2026-04-20 14:54:35.873955028 +0000 UTC m=+65.692580549 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls") pod "dns-default-v5pr2" (UID: "3dfaac5e-6a5e-40e4-81fa-19962c0d578f") : secret "dns-default-metrics-tls" not found Apr 20 14:54:19.974505 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:19.974467 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert\") pod \"ingress-canary-kw4l7\" (UID: \"50a05c11-ab26-4873-920e-803fdbe14912\") " pod="openshift-ingress-canary/ingress-canary-kw4l7" Apr 20 14:54:19.974969 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:19.974627 2538 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:19.974969 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:19.974692 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert podName:50a05c11-ab26-4873-920e-803fdbe14912 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:35.974677698 +0000 UTC m=+65.793303219 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert") pod "ingress-canary-kw4l7" (UID: "50a05c11-ab26-4873-920e-803fdbe14912") : secret "canary-serving-cert" not found Apr 20 14:54:20.907557 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:20.907525 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" event={"ID":"5927be3a-bd47-4737-bf39-2788500862ce","Type":"ContainerStarted","Data":"de045e72099b4fd436465591cf115c1e49c69832189772dfb268fa5cd5f1b46f"} Apr 20 14:54:21.911706 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:21.911669 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" event={"ID":"5927be3a-bd47-4737-bf39-2788500862ce","Type":"ContainerStarted","Data":"4a2d8e59cb75790661db5b6dcdf00868ce814dd46af7d3903ab42c51efa43f21"} Apr 20 14:54:21.933463 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:21.933414 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" podStartSLOduration=2.049504189 podStartE2EDuration="10.93339917s" podCreationTimestamp="2026-04-20 14:54:11 +0000 UTC" firstStartedPulling="2026-04-20 14:54:11.870614136 +0000 UTC m=+41.689239657" lastFinishedPulling="2026-04-20 14:54:20.754509102 +0000 UTC m=+50.573134638" observedRunningTime="2026-04-20 14:54:21.931432681 +0000 UTC m=+51.750058226" watchObservedRunningTime="2026-04-20 14:54:21.93339917 +0000 UTC m=+51.752024713" Apr 20 14:54:28.848171 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:28.848141 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q22pz" Apr 20 14:54:35.890050 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:35.890008 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:54:35.890624 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:35.890183 2538 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:35.890624 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:35.890267 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls podName:3dfaac5e-6a5e-40e4-81fa-19962c0d578f nodeName:}" failed. No retries permitted until 2026-04-20 14:55:07.890245964 +0000 UTC m=+97.708871489 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls") pod "dns-default-v5pr2" (UID: "3dfaac5e-6a5e-40e4-81fa-19962c0d578f") : secret "dns-default-metrics-tls" not found Apr 20 14:54:35.991066 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:35.991033 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert\") pod \"ingress-canary-kw4l7\" (UID: \"50a05c11-ab26-4873-920e-803fdbe14912\") " pod="openshift-ingress-canary/ingress-canary-kw4l7" Apr 20 14:54:35.991254 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:35.991194 2538 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:35.991326 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:35.991280 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert podName:50a05c11-ab26-4873-920e-803fdbe14912 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:07.991257403 +0000 UTC m=+97.809882924 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert") pod "ingress-canary-kw4l7" (UID: "50a05c11-ab26-4873-920e-803fdbe14912") : secret "canary-serving-cert" not found Apr 20 14:54:36.493612 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:36.493573 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frdcz\" (UniqueName: \"kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz\") pod \"network-check-target-rdpq8\" (UID: \"70b39118-4141-405c-9b0c-b59eec25451c\") " pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:54:36.493802 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:36.493652 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:54:36.497070 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:36.497051 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 14:54:36.497070 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:36.497063 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:54:36.504494 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:36.504475 2538 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 14:54:36.504610 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:54:36.504529 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs podName:5752b3ca-4688-4db8-9995-af78bc6f30d3 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:40.504513949 +0000 UTC m=+130.323139470 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs") pod "network-metrics-daemon-fdmrj" (UID: "5752b3ca-4688-4db8-9995-af78bc6f30d3") : secret "metrics-daemon-secret" not found Apr 20 14:54:36.506801 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:36.506784 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 14:54:36.517422 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:36.517397 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frdcz\" (UniqueName: \"kubernetes.io/projected/70b39118-4141-405c-9b0c-b59eec25451c-kube-api-access-frdcz\") pod \"network-check-target-rdpq8\" (UID: \"70b39118-4141-405c-9b0c-b59eec25451c\") " pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:54:36.526697 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:36.526678 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-np6cl\"" Apr 20 14:54:36.534758 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:36.534741 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:54:36.642818 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:36.642788 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rdpq8"] Apr 20 14:54:36.646142 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:54:36.646119 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70b39118_4141_405c_9b0c_b59eec25451c.slice/crio-4b0f1e2008871dee798ab4e38ae960717efa7e3b489f48408f0a913dc14c8da9 WatchSource:0}: Error finding container 4b0f1e2008871dee798ab4e38ae960717efa7e3b489f48408f0a913dc14c8da9: Status 404 returned error can't find the container with id 4b0f1e2008871dee798ab4e38ae960717efa7e3b489f48408f0a913dc14c8da9 Apr 20 14:54:36.940253 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:36.940217 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rdpq8" event={"ID":"70b39118-4141-405c-9b0c-b59eec25451c","Type":"ContainerStarted","Data":"4b0f1e2008871dee798ab4e38ae960717efa7e3b489f48408f0a913dc14c8da9"} Apr 20 14:54:39.949962 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:39.949923 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rdpq8" event={"ID":"70b39118-4141-405c-9b0c-b59eec25451c","Type":"ContainerStarted","Data":"8ef7ba8434a11b2801adc0e41d1f848420e9455b5e55b2541d365b0cee1bae5e"} Apr 20 14:54:39.950365 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:54:39.950070 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:55:07.920857 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:55:07.920802 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2" Apr 20 14:55:07.921332 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:55:07.920965 2538 secret.go:189] Couldn't get 
secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:07.921332 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:55:07.921043 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls podName:3dfaac5e-6a5e-40e4-81fa-19962c0d578f nodeName:}" failed. No retries permitted until 2026-04-20 14:56:11.921026643 +0000 UTC m=+161.739652164 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls") pod "dns-default-v5pr2" (UID: "3dfaac5e-6a5e-40e4-81fa-19962c0d578f") : secret "dns-default-metrics-tls" not found Apr 20 14:55:08.022215 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:55:08.022181 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert\") pod \"ingress-canary-kw4l7\" (UID: \"50a05c11-ab26-4873-920e-803fdbe14912\") " pod="openshift-ingress-canary/ingress-canary-kw4l7" Apr 20 14:55:08.022377 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:55:08.022322 2538 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:55:08.022420 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:55:08.022400 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert podName:50a05c11-ab26-4873-920e-803fdbe14912 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:12.022384279 +0000 UTC m=+161.841009800 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert") pod "ingress-canary-kw4l7" (UID: "50a05c11-ab26-4873-920e-803fdbe14912") : secret "canary-serving-cert" not found Apr 20 14:55:10.954834 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:55:10.954801 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rdpq8" Apr 20 14:55:10.969614 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:55:10.969561 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rdpq8" podStartSLOduration=98.284311227 podStartE2EDuration="1m40.969544419s" podCreationTimestamp="2026-04-20 14:53:30 +0000 UTC" firstStartedPulling="2026-04-20 14:54:36.648055432 +0000 UTC m=+66.466680954" lastFinishedPulling="2026-04-20 14:54:39.333288625 +0000 UTC m=+69.151914146" observedRunningTime="2026-04-20 14:54:39.964949409 +0000 UTC m=+69.783574953" watchObservedRunningTime="2026-04-20 14:55:10.969544419 +0000 UTC m=+100.788170008" Apr 20 14:55:40.551375 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:55:40.551306 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:55:40.551857 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:55:40.551444 2538 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 14:55:40.551857 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:55:40.551516 2538 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs podName:5752b3ca-4688-4db8-9995-af78bc6f30d3 nodeName:}" failed. No retries permitted until 2026-04-20 14:57:42.551498792 +0000 UTC m=+252.370124313 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs") pod "network-metrics-daemon-fdmrj" (UID: "5752b3ca-4688-4db8-9995-af78bc6f30d3") : secret "metrics-daemon-secret" not found Apr 20 14:55:55.094064 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:55:55.094034 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-64772_2d674b0f-26a1-44f7-8346-4ad4d666371e/dns-node-resolver/0.log" Apr 20 14:55:56.094315 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:55:56.094287 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xt2lb_1252f9bc-f9c8-4a62-8bbf-b5e145f0e656/node-ca/0.log" Apr 20 14:56:07.097243 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:56:07.097190 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-v5pr2" podUID="3dfaac5e-6a5e-40e4-81fa-19962c0d578f" Apr 20 14:56:07.112280 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:56:07.112255 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kw4l7" podUID="50a05c11-ab26-4873-920e-803fdbe14912" Apr 20 14:56:07.153852 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:07.153825 2538 util.go:30] "No sandbox for pod can be found. 
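[Note on the retry cadence: the durationBeforeRetry values emitted by nestedpendingoperations.go for each failing volume trace out a per-operation exponential backoff, doubling from 1s through 2s, 4s, 8s, 16s, 32s and 1m4s before pinning at 2m2s (1m4s doubled would be 2m8s, so the observed ceiling is 2m2s). A sketch of that doubling-with-cap progression; the constants are read off this log rather than quoted from kubelet source:]

    package main

    import (
        "fmt"
        "time"
    )

    // nextDelay doubles the previous retry delay and clips it at the ceiling
    // observed in the log (2m2s); the initial 1s is also taken from the log.
    func nextDelay(prev time.Duration) time.Duration {
        const (
            initial = 1 * time.Second
            ceiling = 2*time.Minute + 2*time.Second
        )
        if prev <= 0 {
            return initial
        }
        if next := 2 * prev; next < ceiling {
            return next
        }
        return ceiling
    }

    func main() {
        var d time.Duration
        for i := 0; i < 9; i++ {
            d = nextDelay(d)
            fmt.Print(d, " ") // 1s 2s 4s 8s 16s 32s 1m4s 2m2s 2m2s
        }
        fmt.Println()
    }

[Once the delay reaches the ceiling, a pod can sit unmounted for over two minutes between attempts, which is why the pod_workers.go entries above report "context deadline exceeded" before the secrets finally appear.]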
Apr 20 14:56:07.714107 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:56:07.714067 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-fdmrj" podUID="5752b3ca-4688-4db8-9995-af78bc6f30d3"
Apr 20 14:56:11.958600 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:11.958563 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2"
Apr 20 14:56:11.960735 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:11.960713 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dfaac5e-6a5e-40e4-81fa-19962c0d578f-metrics-tls\") pod \"dns-default-v5pr2\" (UID: \"3dfaac5e-6a5e-40e4-81fa-19962c0d578f\") " pod="openshift-dns/dns-default-v5pr2"
Apr 20 14:56:12.059224 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:12.059193 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert\") pod \"ingress-canary-kw4l7\" (UID: \"50a05c11-ab26-4873-920e-803fdbe14912\") " pod="openshift-ingress-canary/ingress-canary-kw4l7"
Apr 20 14:56:12.061558 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:12.061534 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50a05c11-ab26-4873-920e-803fdbe14912-cert\") pod \"ingress-canary-kw4l7\" (UID: \"50a05c11-ab26-4873-920e-803fdbe14912\") " pod="openshift-ingress-canary/ingress-canary-kw4l7"
Apr 20 14:56:12.257355 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:12.257260 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rc7ph\""
Apr 20 14:56:12.264509 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:12.264485 2538 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-v5pr2"
Apr 20 14:56:12.380649 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:12.380614 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v5pr2"]
Apr 20 14:56:12.384626 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:56:12.384598 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dfaac5e_6a5e_40e4_81fa_19962c0d578f.slice/crio-ccfe80b7a2372bcbfbb9d16eb4e0d754d6fb03e4bfd3a4c4478078b70a2a1a2f WatchSource:0}: Error finding container ccfe80b7a2372bcbfbb9d16eb4e0d754d6fb03e4bfd3a4c4478078b70a2a1a2f: Status 404 returned error can't find the container with id ccfe80b7a2372bcbfbb9d16eb4e0d754d6fb03e4bfd3a4c4478078b70a2a1a2f
Apr 20 14:56:13.168545 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:13.168504 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v5pr2" event={"ID":"3dfaac5e-6a5e-40e4-81fa-19962c0d578f","Type":"ContainerStarted","Data":"ccfe80b7a2372bcbfbb9d16eb4e0d754d6fb03e4bfd3a4c4478078b70a2a1a2f"}
Apr 20 14:56:14.173070 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:14.173029 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v5pr2" event={"ID":"3dfaac5e-6a5e-40e4-81fa-19962c0d578f","Type":"ContainerStarted","Data":"6f5be80643940f04c75e95f4e9dd13cc4026afed7ee942c67e9dfc6c354ef854"}
Apr 20 14:56:14.173070 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:14.173072 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v5pr2" event={"ID":"3dfaac5e-6a5e-40e4-81fa-19962c0d578f","Type":"ContainerStarted","Data":"764d43e203e0835a934c81d4ab503a0dd3ade4270e7edf72f4ab2d212a9a7839"}
Apr 20 14:56:14.173541 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:14.173221 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-v5pr2"
Apr 20 14:56:14.188864 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:14.188816 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-v5pr2" podStartSLOduration=128.99575563 podStartE2EDuration="2m10.188801501s" podCreationTimestamp="2026-04-20 14:54:04 +0000 UTC" firstStartedPulling="2026-04-20 14:56:12.386803785 +0000 UTC m=+162.205429306" lastFinishedPulling="2026-04-20 14:56:13.579849655 +0000 UTC m=+163.398475177" observedRunningTime="2026-04-20 14:56:14.188223981 +0000 UTC m=+164.006849526" watchObservedRunningTime="2026-04-20 14:56:14.188801501 +0000 UTC m=+164.007427044"
Apr 20 14:56:18.901535 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:18.901475 2538 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" podUID="9a1565e7-1ef9-4f8a-8bd8-d45090a0e622" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.9:8000/readyz\": dial tcp 10.133.0.9:8000: connect: connection refused"
Apr 20 14:56:19.187580 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:19.187503 2538 generic.go:358] "Generic (PLEG): container finished" podID="9a1565e7-1ef9-4f8a-8bd8-d45090a0e622" containerID="2c19b93191992070d40a73afc2009015cf43bd0d772ce5f9f0a75e6ca5c55b78" exitCode=1
Apr 20 14:56:19.187714 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:19.187577 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" event={"ID":"9a1565e7-1ef9-4f8a-8bd8-d45090a0e622","Type":"ContainerDied","Data":"2c19b93191992070d40a73afc2009015cf43bd0d772ce5f9f0a75e6ca5c55b78"}
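The generic.go and "SyncLoop (PLEG)" entries above are the pod lifecycle event generator at work: it periodically relists container states from the runtime and converts state transitions into ContainerStarted/ContainerDied events that the kubelet sync loop consumes. A rough, self-contained sketch of that relisting idea, with simplified stand-in types rather than the kubelet's actual implementation:

```go
// Sketch of PLEG-style relisting: diff the previous and current container
// states and emit one event per observed transition. Types and state names
// are simplified stand-ins, not the kubelet's real API.
package main

import "fmt"

type state string

const (
	running state = "running"
	exited  state = "exited"
)

type event struct {
	ContainerID string
	Type        string // "ContainerStarted" or "ContainerDied"
}

func relist(old, cur map[string]state) []event {
	var evs []event
	for id, s := range cur {
		switch {
		case s == running && old[id] != running:
			evs = append(evs, event{id, "ContainerStarted"})
		case s == exited && old[id] == running:
			evs = append(evs, event{id, "ContainerDied"})
		}
	}
	return evs
}

func main() {
	// acm-agent 2c19b931... has exited; its replacement b673d031... comes up.
	old := map[string]state{"2c19b931": running}
	cur := map[string]state{"2c19b931": exited, "b673d031": running}
	for _, e := range relist(old, cur) { // event order is not deterministic
		fmt.Printf("SyncLoop (PLEG): %s %s\n", e.Type, e.ContainerID)
	}
}
```

In the log this is visible as the dead acm-agent container 2c19b931... producing the ContainerDied event above, with its replacement b673d031... surfacing as a ContainerStarted event one relist later.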
Apr 20 14:56:19.187967 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:19.187940 2538 scope.go:117] "RemoveContainer" containerID="2c19b93191992070d40a73afc2009015cf43bd0d772ce5f9f0a75e6ca5c55b78"
Apr 20 14:56:19.188903 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:19.188879 2538 generic.go:358] "Generic (PLEG): container finished" podID="5d56359b-6635-40d6-beac-7e5a10cd4de1" containerID="9db9d8fc8c3c3ffcc006a0d0929b37c51a44072977f4ef70167c591c21e6149c" exitCode=255
Apr 20 14:56:19.189013 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:19.188926 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh" event={"ID":"5d56359b-6635-40d6-beac-7e5a10cd4de1","Type":"ContainerDied","Data":"9db9d8fc8c3c3ffcc006a0d0929b37c51a44072977f4ef70167c591c21e6149c"}
Apr 20 14:56:19.189362 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:19.189194 2538 scope.go:117] "RemoveContainer" containerID="9db9d8fc8c3c3ffcc006a0d0929b37c51a44072977f4ef70167c591c21e6149c"
Apr 20 14:56:19.701198 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:19.701162 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj"
Apr 20 14:56:20.196224 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:20.196183 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx" event={"ID":"9a1565e7-1ef9-4f8a-8bd8-d45090a0e622","Type":"ContainerStarted","Data":"b673d031b69b33f4b79701f2cafc437c111f6fca76c9d41064522eaaf73d847f"}
Apr 20 14:56:20.196685 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:20.196522 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx"
Apr 20 14:56:20.197175 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:20.197153 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc94964d5-tp4rx"
Apr 20 14:56:20.197754 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:20.197726 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bff9cd4dc-7v5lh" event={"ID":"5d56359b-6635-40d6-beac-7e5a10cd4de1","Type":"ContainerStarted","Data":"0d686d7bfaddd40579222f87be77fbeedcbcf23ccb6d525939c408e3f4ac2eed"}
Apr 20 14:56:22.701110 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:22.701072 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kw4l7"
Apr 20 14:56:22.704604 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:22.704578 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jp5zp\""
Apr 20 14:56:22.711615 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:22.711598 2538 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kw4l7" Apr 20 14:56:22.826931 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:22.826899 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kw4l7"] Apr 20 14:56:22.830914 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:56:22.830887 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50a05c11_ab26_4873_920e_803fdbe14912.slice/crio-3c9e419a9146c6f13990c8702e91dcbb8e1f026d1ce15a59542694872c640927 WatchSource:0}: Error finding container 3c9e419a9146c6f13990c8702e91dcbb8e1f026d1ce15a59542694872c640927: Status 404 returned error can't find the container with id 3c9e419a9146c6f13990c8702e91dcbb8e1f026d1ce15a59542694872c640927 Apr 20 14:56:23.207705 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:23.207669 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kw4l7" event={"ID":"50a05c11-ab26-4873-920e-803fdbe14912","Type":"ContainerStarted","Data":"3c9e419a9146c6f13990c8702e91dcbb8e1f026d1ce15a59542694872c640927"} Apr 20 14:56:24.178660 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:24.178629 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-v5pr2" Apr 20 14:56:25.166377 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.164587 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qdbm4"] Apr 20 14:56:25.168745 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.168717 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.171731 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.171709 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 14:56:25.173158 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.173137 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 14:56:25.173238 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.173137 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 14:56:25.173498 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.173482 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-z4lp2\"" Apr 20 14:56:25.173563 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.173483 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 14:56:25.181613 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.181588 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qdbm4"] Apr 20 14:56:25.214953 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.214925 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kw4l7" event={"ID":"50a05c11-ab26-4873-920e-803fdbe14912","Type":"ContainerStarted","Data":"2a34d87a2b05793b96f597bb6d646b2385d28bd4fcfc5693f5330efbfa41cd49"} Apr 20 14:56:25.242202 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.242155 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-kw4l7" podStartSLOduration=139.787362495 podStartE2EDuration="2m21.242141883s" podCreationTimestamp="2026-04-20 14:54:04 +0000 UTC" firstStartedPulling="2026-04-20 14:56:22.833324318 +0000 UTC m=+172.651949846" lastFinishedPulling="2026-04-20 14:56:24.288103709 +0000 UTC m=+174.106729234" observedRunningTime="2026-04-20 14:56:25.241314672 +0000 UTC m=+175.059940219" watchObservedRunningTime="2026-04-20 14:56:25.242141883 +0000 UTC m=+175.060767462" Apr 20 14:56:25.251873 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.251843 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2b12aef5-3be9-4fe2-909b-17d214356c6c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qdbm4\" (UID: \"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.251998 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.251877 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2b12aef5-3be9-4fe2-909b-17d214356c6c-data-volume\") pod \"insights-runtime-extractor-qdbm4\" (UID: \"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.252052 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.251987 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2b12aef5-3be9-4fe2-909b-17d214356c6c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qdbm4\" (UID: \"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.252052 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.252035 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2b12aef5-3be9-4fe2-909b-17d214356c6c-crio-socket\") pod \"insights-runtime-extractor-qdbm4\" (UID: \"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.252143 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.252070 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4lpt\" (UniqueName: \"kubernetes.io/projected/2b12aef5-3be9-4fe2-909b-17d214356c6c-kube-api-access-f4lpt\") pod \"insights-runtime-extractor-qdbm4\" (UID: \"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.352920 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.352883 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2b12aef5-3be9-4fe2-909b-17d214356c6c-data-volume\") pod \"insights-runtime-extractor-qdbm4\" (UID: \"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.353106 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.352938 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2b12aef5-3be9-4fe2-909b-17d214356c6c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qdbm4\" (UID: 
\"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.353106 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.352966 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2b12aef5-3be9-4fe2-909b-17d214356c6c-crio-socket\") pod \"insights-runtime-extractor-qdbm4\" (UID: \"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.353106 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.352992 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4lpt\" (UniqueName: \"kubernetes.io/projected/2b12aef5-3be9-4fe2-909b-17d214356c6c-kube-api-access-f4lpt\") pod \"insights-runtime-extractor-qdbm4\" (UID: \"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.353106 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.353061 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2b12aef5-3be9-4fe2-909b-17d214356c6c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qdbm4\" (UID: \"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.353106 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.353098 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2b12aef5-3be9-4fe2-909b-17d214356c6c-crio-socket\") pod \"insights-runtime-extractor-qdbm4\" (UID: \"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.353435 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.353219 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2b12aef5-3be9-4fe2-909b-17d214356c6c-data-volume\") pod \"insights-runtime-extractor-qdbm4\" (UID: \"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.353634 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.353614 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2b12aef5-3be9-4fe2-909b-17d214356c6c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qdbm4\" (UID: \"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.355198 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.355182 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2b12aef5-3be9-4fe2-909b-17d214356c6c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qdbm4\" (UID: \"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.362230 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.362211 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4lpt\" (UniqueName: \"kubernetes.io/projected/2b12aef5-3be9-4fe2-909b-17d214356c6c-kube-api-access-f4lpt\") pod \"insights-runtime-extractor-qdbm4\" (UID: \"2b12aef5-3be9-4fe2-909b-17d214356c6c\") " pod="openshift-insights/insights-runtime-extractor-qdbm4" Apr 20 14:56:25.477415 
ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.477320 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qdbm4"
Apr 20 14:56:25.591972 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:25.591942 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qdbm4"]
Apr 20 14:56:25.595310 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:56:25.595282 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b12aef5_3be9_4fe2_909b_17d214356c6c.slice/crio-26451b55eb783d6fdd3d9b8ac038657440caa4d58fa5d13a0d5508c9dcf51b86 WatchSource:0}: Error finding container 26451b55eb783d6fdd3d9b8ac038657440caa4d58fa5d13a0d5508c9dcf51b86: Status 404 returned error can't find the container with id 26451b55eb783d6fdd3d9b8ac038657440caa4d58fa5d13a0d5508c9dcf51b86
Apr 20 14:56:26.218643 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:26.218604 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qdbm4" event={"ID":"2b12aef5-3be9-4fe2-909b-17d214356c6c","Type":"ContainerStarted","Data":"9039f42007991b58533a001f6cdc0cf4a6ca7a160564930af968b786e0a7c7e5"}
Apr 20 14:56:26.218643 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:26.218644 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qdbm4" event={"ID":"2b12aef5-3be9-4fe2-909b-17d214356c6c","Type":"ContainerStarted","Data":"26451b55eb783d6fdd3d9b8ac038657440caa4d58fa5d13a0d5508c9dcf51b86"}
Apr 20 14:56:27.222943 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:27.222907 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qdbm4" event={"ID":"2b12aef5-3be9-4fe2-909b-17d214356c6c","Type":"ContainerStarted","Data":"0ed20b437602d7565b91925fee3d48cf50650dc19956937b0c66e53c360306f2"}
Apr 20 14:56:28.227404 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:28.227368 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qdbm4" event={"ID":"2b12aef5-3be9-4fe2-909b-17d214356c6c","Type":"ContainerStarted","Data":"d04cf48ac63537045fe9e26c91028a16f3bede0a9a9e7062d384b833711975f7"}
Apr 20 14:56:28.245176 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:28.245127 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qdbm4" podStartSLOduration=1.353845804 podStartE2EDuration="3.245114066s" podCreationTimestamp="2026-04-20 14:56:25 +0000 UTC" firstStartedPulling="2026-04-20 14:56:25.662547814 +0000 UTC m=+175.481173335" lastFinishedPulling="2026-04-20 14:56:27.553816072 +0000 UTC m=+177.372441597" observedRunningTime="2026-04-20 14:56:28.243665685 +0000 UTC m=+178.062291229" watchObservedRunningTime="2026-04-20 14:56:28.245114066 +0000 UTC m=+178.063739609"
Apr 20 14:56:31.867183 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.867147 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xtbzh"]
Apr 20 14:56:31.870398 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.870379 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xtbzh"
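The "Observed pod startup duration" entry above also shows how podStartSLOduration relates to the other fields: it appears to be the end-to-end startup duration minus the image-pull window, so pull time does not count against the startup SLO. A quick check of that arithmetic against the insights-runtime-extractor-qdbm4 numbers, using the monotonic m=+ offsets and assuming the pull window is the only correction applied:

```go
// Verifies: podStartSLOduration = podStartE2EDuration -
// (lastFinishedPulling - firstStartedPulling), using the m=+ offsets
// logged for insights-runtime-extractor-qdbm4.
package main

import "fmt"

func main() {
	const (
		e2e          = 3.245114066   // podStartE2EDuration, seconds
		firstPulling = 175.481173335 // firstStartedPulling offset
		lastPulling  = 177.372441597 // lastFinishedPulling offset
	)
	slo := e2e - (lastPulling - firstPulling)
	fmt.Printf("podStartSLOduration ~ %.9f\n", slo) // ~ 1.353845804, as logged
}
```

The same relation holds for the other pods in this log, e.g. ingress-canary-kw4l7: 141.242141883 - (174.106729234 - 172.651949846) = 139.787362495.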
Apr 20 14:56:31.872903 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.872877 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 14:56:31.874062 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.874038 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 14:56:31.874163 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.874063 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2xwxs\""
Apr 20 14:56:31.874163 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.874137 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 14:56:31.874253 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.874181 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 14:56:31.874253 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.874243 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 14:56:31.874364 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.874332 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 14:56:31.899950 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.899924 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkzj7\" (UniqueName: \"kubernetes.io/projected/df1a19f5-b9af-4b67-a68b-7d07365aeefa-kube-api-access-fkzj7\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh"
Apr 20 14:56:31.900053 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.899963 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/df1a19f5-b9af-4b67-a68b-7d07365aeefa-root\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh"
Apr 20 14:56:31.900053 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.899988 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df1a19f5-b9af-4b67-a68b-7d07365aeefa-sys\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh"
Apr 20 14:56:31.900053 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.900036 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-wtmp\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh"
Apr 20 14:56:31.900144 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.900056 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName:
\"kubernetes.io/secret/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:31.900144 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.900074 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-textfile\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:31.900209 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.900154 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:31.900209 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.900198 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-tls\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:31.900272 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:31.900222 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df1a19f5-b9af-4b67-a68b-7d07365aeefa-metrics-client-ca\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.000531 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.000498 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-tls\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.000531 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.000534 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df1a19f5-b9af-4b67-a68b-7d07365aeefa-metrics-client-ca\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.000772 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.000558 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkzj7\" (UniqueName: \"kubernetes.io/projected/df1a19f5-b9af-4b67-a68b-7d07365aeefa-kube-api-access-fkzj7\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.000772 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.000581 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/df1a19f5-b9af-4b67-a68b-7d07365aeefa-root\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " 
pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.000772 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.000614 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df1a19f5-b9af-4b67-a68b-7d07365aeefa-sys\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.000772 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.000640 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-wtmp\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.000772 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.000658 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.000772 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:56:32.000671 2538 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 14:56:32.000772 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.000686 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-textfile\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.000772 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.000723 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df1a19f5-b9af-4b67-a68b-7d07365aeefa-sys\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.000772 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.000734 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.000772 ip-10-0-142-255 kubenswrapper[2538]: E0420 14:56:32.000752 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-tls podName:df1a19f5-b9af-4b67-a68b-7d07365aeefa nodeName:}" failed. No retries permitted until 2026-04-20 14:56:32.50072911 +0000 UTC m=+182.319354634 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-tls") pod "node-exporter-xtbzh" (UID: "df1a19f5-b9af-4b67-a68b-7d07365aeefa") : secret "node-exporter-tls" not found Apr 20 14:56:32.001141 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.000781 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/df1a19f5-b9af-4b67-a68b-7d07365aeefa-root\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.001141 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.000820 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-wtmp\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.001141 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.001070 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-textfile\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.001243 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.001224 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df1a19f5-b9af-4b67-a68b-7d07365aeefa-metrics-client-ca\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.001291 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.001275 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.002922 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.002901 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.012173 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.012144 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkzj7\" (UniqueName: \"kubernetes.io/projected/df1a19f5-b9af-4b67-a68b-7d07365aeefa-kube-api-access-fkzj7\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.505222 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.505161 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-tls\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 
14:56:32.507504 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.507483 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/df1a19f5-b9af-4b67-a68b-7d07365aeefa-node-exporter-tls\") pod \"node-exporter-xtbzh\" (UID: \"df1a19f5-b9af-4b67-a68b-7d07365aeefa\") " pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.779652 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:32.779568 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xtbzh" Apr 20 14:56:32.789568 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:56:32.789533 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf1a19f5_b9af_4b67_a68b_7d07365aeefa.slice/crio-2adde4c4f2a2b6d689ce859561de4d4ebaefa62279b15b985ba0f3cd27d9bb1c WatchSource:0}: Error finding container 2adde4c4f2a2b6d689ce859561de4d4ebaefa62279b15b985ba0f3cd27d9bb1c: Status 404 returned error can't find the container with id 2adde4c4f2a2b6d689ce859561de4d4ebaefa62279b15b985ba0f3cd27d9bb1c Apr 20 14:56:33.241055 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:33.241021 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtbzh" event={"ID":"df1a19f5-b9af-4b67-a68b-7d07365aeefa","Type":"ContainerStarted","Data":"2adde4c4f2a2b6d689ce859561de4d4ebaefa62279b15b985ba0f3cd27d9bb1c"} Apr 20 14:56:34.244582 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:34.244547 2538 generic.go:358] "Generic (PLEG): container finished" podID="df1a19f5-b9af-4b67-a68b-7d07365aeefa" containerID="78020524bd0cd21f3aa60ba91f6d355f4b6384dd85c6d60f804a917145a6dac8" exitCode=0 Apr 20 14:56:34.244963 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:34.244588 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtbzh" event={"ID":"df1a19f5-b9af-4b67-a68b-7d07365aeefa","Type":"ContainerDied","Data":"78020524bd0cd21f3aa60ba91f6d355f4b6384dd85c6d60f804a917145a6dac8"} Apr 20 14:56:35.248879 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:35.248837 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtbzh" event={"ID":"df1a19f5-b9af-4b67-a68b-7d07365aeefa","Type":"ContainerStarted","Data":"3e19e57423c65a21e6fa902de4dfa55c74b050fd23720a7736e78a67af4ade9c"} Apr 20 14:56:35.248879 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:35.248883 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtbzh" event={"ID":"df1a19f5-b9af-4b67-a68b-7d07365aeefa","Type":"ContainerStarted","Data":"40c22f1c91fcc3217e43031f203f4250416a0bbaa8764a5036a23a1ad8a2348c"} Apr 20 14:56:35.268124 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:35.268080 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xtbzh" podStartSLOduration=3.467368666 podStartE2EDuration="4.268065797s" podCreationTimestamp="2026-04-20 14:56:31 +0000 UTC" firstStartedPulling="2026-04-20 14:56:32.791305729 +0000 UTC m=+182.609931255" lastFinishedPulling="2026-04-20 14:56:33.592002865 +0000 UTC m=+183.410628386" observedRunningTime="2026-04-20 14:56:35.266819747 +0000 UTC m=+185.085445289" watchObservedRunningTime="2026-04-20 14:56:35.268065797 +0000 UTC m=+185.086691340" Apr 20 14:56:37.039658 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.039627 2538 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/telemeter-client-667b67845d-t24gr"] Apr 20 14:56:37.042642 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.042613 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.045153 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.045132 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 20 14:56:37.045399 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.045375 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 20 14:56:37.045509 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.045404 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-srrn8\"" Apr 20 14:56:37.045509 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.045440 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 20 14:56:37.045509 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.045410 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 14:56:37.046431 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.046411 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 20 14:56:37.051170 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.051149 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 14:56:37.052818 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.052790 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-667b67845d-t24gr"] Apr 20 14:56:37.139092 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.139039 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/34da4e27-67e3-476b-9fe2-5389ceef268e-federate-client-tls\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.139260 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.139131 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34da4e27-67e3-476b-9fe2-5389ceef268e-metrics-client-ca\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.139260 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.139189 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34da4e27-67e3-476b-9fe2-5389ceef268e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.139260 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.139214 2538 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/34da4e27-67e3-476b-9fe2-5389ceef268e-secret-telemeter-client\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.139260 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.139235 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/34da4e27-67e3-476b-9fe2-5389ceef268e-telemeter-client-tls\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.139260 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.139256 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhv49\" (UniqueName: \"kubernetes.io/projected/34da4e27-67e3-476b-9fe2-5389ceef268e-kube-api-access-zhv49\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.139468 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.139282 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34da4e27-67e3-476b-9fe2-5389ceef268e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.139468 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.139319 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34da4e27-67e3-476b-9fe2-5389ceef268e-serving-certs-ca-bundle\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.239839 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.239799 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/34da4e27-67e3-476b-9fe2-5389ceef268e-federate-client-tls\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.239839 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.239840 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34da4e27-67e3-476b-9fe2-5389ceef268e-metrics-client-ca\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.240035 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.239861 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34da4e27-67e3-476b-9fe2-5389ceef268e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " 
pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.240035 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.239883 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/34da4e27-67e3-476b-9fe2-5389ceef268e-secret-telemeter-client\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.240035 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.239901 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/34da4e27-67e3-476b-9fe2-5389ceef268e-telemeter-client-tls\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.240035 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.239918 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhv49\" (UniqueName: \"kubernetes.io/projected/34da4e27-67e3-476b-9fe2-5389ceef268e-kube-api-access-zhv49\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.240035 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.239936 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34da4e27-67e3-476b-9fe2-5389ceef268e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.240035 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.239978 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34da4e27-67e3-476b-9fe2-5389ceef268e-serving-certs-ca-bundle\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.241034 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.241000 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34da4e27-67e3-476b-9fe2-5389ceef268e-serving-certs-ca-bundle\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.241161 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.241051 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34da4e27-67e3-476b-9fe2-5389ceef268e-metrics-client-ca\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.242369 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.241296 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34da4e27-67e3-476b-9fe2-5389ceef268e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-667b67845d-t24gr\" (UID: 
\"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.243840 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.243181 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34da4e27-67e3-476b-9fe2-5389ceef268e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.243840 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.243773 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/34da4e27-67e3-476b-9fe2-5389ceef268e-telemeter-client-tls\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.246174 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.246148 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/34da4e27-67e3-476b-9fe2-5389ceef268e-federate-client-tls\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.247286 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.247264 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/34da4e27-67e3-476b-9fe2-5389ceef268e-secret-telemeter-client\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.248815 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.248791 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhv49\" (UniqueName: \"kubernetes.io/projected/34da4e27-67e3-476b-9fe2-5389ceef268e-kube-api-access-zhv49\") pod \"telemeter-client-667b67845d-t24gr\" (UID: \"34da4e27-67e3-476b-9fe2-5389ceef268e\") " pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.351496 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.351459 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" Apr 20 14:56:37.471979 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:37.471952 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-667b67845d-t24gr"] Apr 20 14:56:37.472991 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:56:37.472959 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34da4e27_67e3_476b_9fe2_5389ceef268e.slice/crio-5e7e387e3dbad4c3a1d44b40d2e3cddfb87cc2c0803b361aa09610d0611ef2e7 WatchSource:0}: Error finding container 5e7e387e3dbad4c3a1d44b40d2e3cddfb87cc2c0803b361aa09610d0611ef2e7: Status 404 returned error can't find the container with id 5e7e387e3dbad4c3a1d44b40d2e3cddfb87cc2c0803b361aa09610d0611ef2e7 Apr 20 14:56:38.257903 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:38.257864 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" event={"ID":"34da4e27-67e3-476b-9fe2-5389ceef268e","Type":"ContainerStarted","Data":"5e7e387e3dbad4c3a1d44b40d2e3cddfb87cc2c0803b361aa09610d0611ef2e7"} Apr 20 14:56:40.264625 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:40.264591 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" event={"ID":"34da4e27-67e3-476b-9fe2-5389ceef268e","Type":"ContainerStarted","Data":"706bbdf9d825d1a6cba9fd50327d0cff9443551cf510e2b4d3b0d3af0dfd8d57"} Apr 20 14:56:41.268771 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:41.268732 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" event={"ID":"34da4e27-67e3-476b-9fe2-5389ceef268e","Type":"ContainerStarted","Data":"8c86e74f891c0d4bf644004444159839ec4e14754779ba0352b0bcb4fe2658ef"} Apr 20 14:56:41.268771 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:41.268774 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" event={"ID":"34da4e27-67e3-476b-9fe2-5389ceef268e","Type":"ContainerStarted","Data":"c60c2091ac5280a3bdd59e973a892927902d33fb0dcd5c89deb0b32d28e51bfc"} Apr 20 14:56:41.298319 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:56:41.298267 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-667b67845d-t24gr" podStartSLOduration=1.419160995 podStartE2EDuration="4.298252085s" podCreationTimestamp="2026-04-20 14:56:37 +0000 UTC" firstStartedPulling="2026-04-20 14:56:37.474900803 +0000 UTC m=+187.293526324" lastFinishedPulling="2026-04-20 14:56:40.35399189 +0000 UTC m=+190.172617414" observedRunningTime="2026-04-20 14:56:41.297200917 +0000 UTC m=+191.115826459" watchObservedRunningTime="2026-04-20 14:56:41.298252085 +0000 UTC m=+191.116877625" Apr 20 14:57:01.709158 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:01.709116 2538 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" podUID="5927be3a-bd47-4737-bf39-2788500862ce" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 14:57:11.709923 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:11.709877 2538 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" podUID="5927be3a-bd47-4737-bf39-2788500862ce" 
containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 14:57:21.709742 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:21.709703 2538 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" podUID="5927be3a-bd47-4737-bf39-2788500862ce" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 14:57:21.710119 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:21.709785 2538 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x"
Apr 20 14:57:21.710275 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:21.710245 2538 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"4a2d8e59cb75790661db5b6dcdf00868ce814dd46af7d3903ab42c51efa43f21"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 20 14:57:21.710319 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:21.710307 2538 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" podUID="5927be3a-bd47-4737-bf39-2788500862ce" containerName="service-proxy" containerID="cri-o://4a2d8e59cb75790661db5b6dcdf00868ce814dd46af7d3903ab42c51efa43f21" gracePeriod=30
Apr 20 14:57:22.373145 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:22.373111 2538 generic.go:358] "Generic (PLEG): container finished" podID="5927be3a-bd47-4737-bf39-2788500862ce" containerID="4a2d8e59cb75790661db5b6dcdf00868ce814dd46af7d3903ab42c51efa43f21" exitCode=2
Apr 20 14:57:22.373365 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:22.373181 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" event={"ID":"5927be3a-bd47-4737-bf39-2788500862ce","Type":"ContainerDied","Data":"4a2d8e59cb75790661db5b6dcdf00868ce814dd46af7d3903ab42c51efa43f21"}
Apr 20 14:57:22.373365 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:22.373222 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76f5bb6df5-6zq9x" event={"ID":"5927be3a-bd47-4737-bf39-2788500862ce","Type":"ContainerStarted","Data":"9f217028f4bb91ad4316bc2e64d82f296a0207eabc5cb31324a090cb659a9a71"}
Apr 20 14:57:42.632458 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:42.632418 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj"
Apr 20 14:57:42.634731 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:42.634710 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5752b3ca-4688-4db8-9995-af78bc6f30d3-metrics-certs\") pod \"network-metrics-daemon-fdmrj\" (UID: \"5752b3ca-4688-4db8-9995-af78bc6f30d3\") " pod="openshift-multus/network-metrics-daemon-fdmrj"
Apr 20 14:57:42.805236 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:42.805199 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ns2f2\""
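The three liveness failures above land exactly 10s apart (14:57:01, 14:57:11, 14:57:21) before the kubelet declares the container unhealthy and kills it with gracePeriod=30, which is consistent with periodSeconds=10 and the Kubernetes default failureThreshold of 3; the pod spec itself is not in this log, so the threshold is an assumption. A small timeline sketch of that bookkeeping, a model rather than kubelet code:

```go
// Replays the service-proxy liveness timeline: one failure per probe period,
// restart triggered once the consecutive-failure threshold is reached.
// failureThreshold=3 is the Kubernetes default, assumed here.
package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		period           = 10 * time.Second // spacing of the failures above
		failureThreshold = 3                // default; not visible in this log
		grace            = 30 * time.Second // gracePeriod=30 from the kill entry
	)
	start := time.Date(2026, time.April, 20, 14, 57, 1, 0, time.UTC)
	fails := 0
	for t := start; ; t = t.Add(period) {
		fails++
		fmt.Printf("%s liveness probe failed (%d/%d)\n", t.Format("15:04:05"), fails, failureThreshold)
		if fails >= failureThreshold {
			fmt.Printf("%s unhealthy: killing container with gracePeriod=%v\n", t.Format("15:04:05"), grace)
			break
		}
	}
}
```

The ContainerDied (exitCode=2) and ContainerStarted events that follow in the log show the restart completing within about a second of the kill.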
type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ns2f2\"" Apr 20 14:57:42.812750 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:42.812729 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdmrj" Apr 20 14:57:42.928592 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:42.928563 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fdmrj"] Apr 20 14:57:42.931511 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:57:42.931469 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5752b3ca_4688_4db8_9995_af78bc6f30d3.slice/crio-98c5786e4fe46dfc068441e0a25e9dd6f2496765b0e300541bb81ff82b960ed2 WatchSource:0}: Error finding container 98c5786e4fe46dfc068441e0a25e9dd6f2496765b0e300541bb81ff82b960ed2: Status 404 returned error can't find the container with id 98c5786e4fe46dfc068441e0a25e9dd6f2496765b0e300541bb81ff82b960ed2 Apr 20 14:57:43.429893 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:43.429853 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fdmrj" event={"ID":"5752b3ca-4688-4db8-9995-af78bc6f30d3","Type":"ContainerStarted","Data":"98c5786e4fe46dfc068441e0a25e9dd6f2496765b0e300541bb81ff82b960ed2"} Apr 20 14:57:44.433421 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:44.433337 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fdmrj" event={"ID":"5752b3ca-4688-4db8-9995-af78bc6f30d3","Type":"ContainerStarted","Data":"eb05c390fb3b21cd8691a9db9f92b2f26320098bc70e91407fab362770b98c55"} Apr 20 14:57:44.433812 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:44.433425 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fdmrj" event={"ID":"5752b3ca-4688-4db8-9995-af78bc6f30d3","Type":"ContainerStarted","Data":"e576fef9298b14b8fc0338c2dcf9d23e2e61f7c31a37f83dbf23d8757a0284d2"} Apr 20 14:57:44.448810 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:57:44.448764 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fdmrj" podStartSLOduration=253.44468953 podStartE2EDuration="4m14.448749106s" podCreationTimestamp="2026-04-20 14:53:30 +0000 UTC" firstStartedPulling="2026-04-20 14:57:42.933233296 +0000 UTC m=+252.751858818" lastFinishedPulling="2026-04-20 14:57:43.93729286 +0000 UTC m=+253.755918394" observedRunningTime="2026-04-20 14:57:44.44777826 +0000 UTC m=+254.266403848" watchObservedRunningTime="2026-04-20 14:57:44.448749106 +0000 UTC m=+254.267374648" Apr 20 14:58:30.619222 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:58:30.619187 2538 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 14:59:31.621681 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:31.621642 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-sc7tm"] Apr 20 14:59:31.624559 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:31.624544 2538 util.go:30] "No sandbox for pod can be found. 
Apr 20 14:59:31.627295 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:31.627269 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 20 14:59:31.627433 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:31.627280 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-jx6jt\""
Apr 20 14:59:31.628396 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:31.628379 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 20 14:59:31.631390 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:31.631102 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-sc7tm"]
Apr 20 14:59:31.705302 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:31.705272 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7dac5d4d-7074-4cc8-a214-c276e9876766-tmp\") pod \"openshift-lws-operator-bfc7f696d-sc7tm\" (UID: \"7dac5d4d-7074-4cc8-a214-c276e9876766\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sc7tm"
Apr 20 14:59:31.705302 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:31.705307 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2nx6\" (UniqueName: \"kubernetes.io/projected/7dac5d4d-7074-4cc8-a214-c276e9876766-kube-api-access-b2nx6\") pod \"openshift-lws-operator-bfc7f696d-sc7tm\" (UID: \"7dac5d4d-7074-4cc8-a214-c276e9876766\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sc7tm"
Apr 20 14:59:31.805657 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:31.805614 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7dac5d4d-7074-4cc8-a214-c276e9876766-tmp\") pod \"openshift-lws-operator-bfc7f696d-sc7tm\" (UID: \"7dac5d4d-7074-4cc8-a214-c276e9876766\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sc7tm"
Apr 20 14:59:31.805657 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:31.805657 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2nx6\" (UniqueName: \"kubernetes.io/projected/7dac5d4d-7074-4cc8-a214-c276e9876766-kube-api-access-b2nx6\") pod \"openshift-lws-operator-bfc7f696d-sc7tm\" (UID: \"7dac5d4d-7074-4cc8-a214-c276e9876766\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sc7tm"
Apr 20 14:59:31.806014 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:31.805995 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7dac5d4d-7074-4cc8-a214-c276e9876766-tmp\") pod \"openshift-lws-operator-bfc7f696d-sc7tm\" (UID: \"7dac5d4d-7074-4cc8-a214-c276e9876766\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sc7tm"
Apr 20 14:59:31.813316 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:31.813284 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2nx6\" (UniqueName: \"kubernetes.io/projected/7dac5d4d-7074-4cc8-a214-c276e9876766-kube-api-access-b2nx6\") pod \"openshift-lws-operator-bfc7f696d-sc7tm\" (UID: \"7dac5d4d-7074-4cc8-a214-c276e9876766\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sc7tm"
Apr 20 14:59:31.933912 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:31.933824 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sc7tm"
Apr 20 14:59:32.051702 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:32.051677 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-sc7tm"]
Apr 20 14:59:32.054725 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:59:32.054694 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dac5d4d_7074_4cc8_a214_c276e9876766.slice/crio-6fd0a1169e61388795e0965f5f5a4e214fbd4083fc5fae43bb211c52bf8d5de1 WatchSource:0}: Error finding container 6fd0a1169e61388795e0965f5f5a4e214fbd4083fc5fae43bb211c52bf8d5de1: Status 404 returned error can't find the container with id 6fd0a1169e61388795e0965f5f5a4e214fbd4083fc5fae43bb211c52bf8d5de1
Apr 20 14:59:32.056156 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:32.056140 2538 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 14:59:32.714896 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:32.714865 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sc7tm" event={"ID":"7dac5d4d-7074-4cc8-a214-c276e9876766","Type":"ContainerStarted","Data":"6fd0a1169e61388795e0965f5f5a4e214fbd4083fc5fae43bb211c52bf8d5de1"}
Apr 20 14:59:35.731563 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:35.731523 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sc7tm" event={"ID":"7dac5d4d-7074-4cc8-a214-c276e9876766","Type":"ContainerStarted","Data":"6fb590843e222a9dac74510ff355cbc7be9481138360bf61804210d37acda957"}
Apr 20 14:59:35.747021 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:35.746963 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sc7tm" podStartSLOduration=2.133301052 podStartE2EDuration="4.746943936s" podCreationTimestamp="2026-04-20 14:59:31 +0000 UTC" firstStartedPulling="2026-04-20 14:59:32.056262338 +0000 UTC m=+361.874887859" lastFinishedPulling="2026-04-20 14:59:34.669905209 +0000 UTC m=+364.488530743" observedRunningTime="2026-04-20 14:59:35.746407404 +0000 UTC m=+365.565032949" watchObservedRunningTime="2026-04-20 14:59:35.746943936 +0000 UTC m=+365.565569482"
Apr 20 14:59:51.065863 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.065823 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"]
Apr 20 14:59:51.068909 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.068885 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"
Apr 20 14:59:51.071580 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.071552 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 14:59:51.071706 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.071611 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 14:59:51.071706 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.071614 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 14:59:51.071942 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.071925 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-525vc\""
Apr 20 14:59:51.071942 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.071936 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 14:59:51.083717 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.083692 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"]
Apr 20 14:59:51.153247 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.153216 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4997ee7e-4a5c-445a-b713-94077b5e7f2d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-zflg5\" (UID: \"4997ee7e-4a5c-445a-b713-94077b5e7f2d\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"
Apr 20 14:59:51.153247 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.153252 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrgqr\" (UniqueName: \"kubernetes.io/projected/4997ee7e-4a5c-445a-b713-94077b5e7f2d-kube-api-access-mrgqr\") pod \"opendatahub-operator-controller-manager-854569cf8c-zflg5\" (UID: \"4997ee7e-4a5c-445a-b713-94077b5e7f2d\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"
Apr 20 14:59:51.153474 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.153277 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4997ee7e-4a5c-445a-b713-94077b5e7f2d-webhook-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-zflg5\" (UID: \"4997ee7e-4a5c-445a-b713-94077b5e7f2d\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"
Apr 20 14:59:51.254400 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.254318 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4997ee7e-4a5c-445a-b713-94077b5e7f2d-webhook-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-zflg5\" (UID: \"4997ee7e-4a5c-445a-b713-94077b5e7f2d\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"
Apr 20 14:59:51.254587 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.254445 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4997ee7e-4a5c-445a-b713-94077b5e7f2d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-zflg5\" (UID: \"4997ee7e-4a5c-445a-b713-94077b5e7f2d\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"
Apr 20 14:59:51.254587 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.254471 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrgqr\" (UniqueName: \"kubernetes.io/projected/4997ee7e-4a5c-445a-b713-94077b5e7f2d-kube-api-access-mrgqr\") pod \"opendatahub-operator-controller-manager-854569cf8c-zflg5\" (UID: \"4997ee7e-4a5c-445a-b713-94077b5e7f2d\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"
Apr 20 14:59:51.256984 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.256949 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4997ee7e-4a5c-445a-b713-94077b5e7f2d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-zflg5\" (UID: \"4997ee7e-4a5c-445a-b713-94077b5e7f2d\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"
Apr 20 14:59:51.257094 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.256949 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4997ee7e-4a5c-445a-b713-94077b5e7f2d-webhook-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-zflg5\" (UID: \"4997ee7e-4a5c-445a-b713-94077b5e7f2d\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"
Apr 20 14:59:51.262642 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.262620 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrgqr\" (UniqueName: \"kubernetes.io/projected/4997ee7e-4a5c-445a-b713-94077b5e7f2d-kube-api-access-mrgqr\") pod \"opendatahub-operator-controller-manager-854569cf8c-zflg5\" (UID: \"4997ee7e-4a5c-445a-b713-94077b5e7f2d\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"
Apr 20 14:59:51.378410 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.378371 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"
Apr 20 14:59:51.501129 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.501094 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"]
Apr 20 14:59:51.504922 ip-10-0-142-255 kubenswrapper[2538]: W0420 14:59:51.504895 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4997ee7e_4a5c_445a_b713_94077b5e7f2d.slice/crio-7578b6d5543e43e454ca5ff452550b8b8bafd08d201ef80c061f900b3eed960a WatchSource:0}: Error finding container 7578b6d5543e43e454ca5ff452550b8b8bafd08d201ef80c061f900b3eed960a: Status 404 returned error can't find the container with id 7578b6d5543e43e454ca5ff452550b8b8bafd08d201ef80c061f900b3eed960a
Apr 20 14:59:51.773745 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:51.773654 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5" event={"ID":"4997ee7e-4a5c-445a-b713-94077b5e7f2d","Type":"ContainerStarted","Data":"7578b6d5543e43e454ca5ff452550b8b8bafd08d201ef80c061f900b3eed960a"}
Apr 20 14:59:54.782629 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:54.782591 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5" event={"ID":"4997ee7e-4a5c-445a-b713-94077b5e7f2d","Type":"ContainerStarted","Data":"caf3a0ab22d856fcdbc8b3c570a8e767bf3b915fa3f7b4d833f471389df037e2"}
Apr 20 14:59:54.783067 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:54.782709 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"
Apr 20 14:59:54.808978 ip-10-0-142-255 kubenswrapper[2538]: I0420 14:59:54.808920 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5" podStartSLOduration=1.321405389 podStartE2EDuration="3.808899588s" podCreationTimestamp="2026-04-20 14:59:51 +0000 UTC" firstStartedPulling="2026-04-20 14:59:51.506841922 +0000 UTC m=+381.325467443" lastFinishedPulling="2026-04-20 14:59:53.994336118 +0000 UTC m=+383.812961642" observedRunningTime="2026-04-20 14:59:54.807058181 +0000 UTC m=+384.625683723" watchObservedRunningTime="2026-04-20 14:59:54.808899588 +0000 UTC m=+384.627525131"
Apr 20 15:00:05.787254 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:05.787222 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zflg5"
Apr 20 15:00:08.631122 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.631080 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn"]
Apr 20 15:00:08.649109 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.649076 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn"]
Apr 20 15:00:08.649263 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.649197 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn"
Apr 20 15:00:08.652013 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.651991 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 20 15:00:08.653509 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.653288 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 15:00:08.653509 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.653298 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 20 15:00:08.653509 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.653329 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 15:00:08.653509 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.653298 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-m2765\""
Apr 20 15:00:08.791076 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.791045 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5q42\" (UniqueName: \"kubernetes.io/projected/29878348-f049-4527-b442-17692aa7a4f3-kube-api-access-m5q42\") pod \"kube-auth-proxy-597dfdc786-zp9qn\" (UID: \"29878348-f049-4527-b442-17692aa7a4f3\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn"
Apr 20 15:00:08.791270 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.791088 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29878348-f049-4527-b442-17692aa7a4f3-tmp\") pod \"kube-auth-proxy-597dfdc786-zp9qn\" (UID: \"29878348-f049-4527-b442-17692aa7a4f3\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn"
Apr 20 15:00:08.791270 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.791123 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/29878348-f049-4527-b442-17692aa7a4f3-tls-certs\") pod \"kube-auth-proxy-597dfdc786-zp9qn\" (UID: \"29878348-f049-4527-b442-17692aa7a4f3\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn"
Apr 20 15:00:08.892385 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.892266 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5q42\" (UniqueName: \"kubernetes.io/projected/29878348-f049-4527-b442-17692aa7a4f3-kube-api-access-m5q42\") pod \"kube-auth-proxy-597dfdc786-zp9qn\" (UID: \"29878348-f049-4527-b442-17692aa7a4f3\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn"
Apr 20 15:00:08.892385 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.892331 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29878348-f049-4527-b442-17692aa7a4f3-tmp\") pod \"kube-auth-proxy-597dfdc786-zp9qn\" (UID: \"29878348-f049-4527-b442-17692aa7a4f3\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn"
Apr 20 15:00:08.892627 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.892427 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/29878348-f049-4527-b442-17692aa7a4f3-tls-certs\") pod \"kube-auth-proxy-597dfdc786-zp9qn\" (UID: \"29878348-f049-4527-b442-17692aa7a4f3\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn"
Apr 20 15:00:08.894575 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.894540 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29878348-f049-4527-b442-17692aa7a4f3-tmp\") pod \"kube-auth-proxy-597dfdc786-zp9qn\" (UID: \"29878348-f049-4527-b442-17692aa7a4f3\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn"
Apr 20 15:00:08.894794 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.894774 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/29878348-f049-4527-b442-17692aa7a4f3-tls-certs\") pod \"kube-auth-proxy-597dfdc786-zp9qn\" (UID: \"29878348-f049-4527-b442-17692aa7a4f3\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn"
Apr 20 15:00:08.903232 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.903192 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5q42\" (UniqueName: \"kubernetes.io/projected/29878348-f049-4527-b442-17692aa7a4f3-kube-api-access-m5q42\") pod \"kube-auth-proxy-597dfdc786-zp9qn\" (UID: \"29878348-f049-4527-b442-17692aa7a4f3\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn"
Apr 20 15:00:08.960067 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:08.960031 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn"
Apr 20 15:00:09.078013 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:09.077868 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn"]
Apr 20 15:00:09.080632 ip-10-0-142-255 kubenswrapper[2538]: W0420 15:00:09.080607 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29878348_f049_4527_b442_17692aa7a4f3.slice/crio-da06ae40e5677588cc18886dc5bea53db3345db8003ed5b33be493d77f2e5be1 WatchSource:0}: Error finding container da06ae40e5677588cc18886dc5bea53db3345db8003ed5b33be493d77f2e5be1: Status 404 returned error can't find the container with id da06ae40e5677588cc18886dc5bea53db3345db8003ed5b33be493d77f2e5be1
Apr 20 15:00:09.826702 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:09.826653 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn" event={"ID":"29878348-f049-4527-b442-17692aa7a4f3","Type":"ContainerStarted","Data":"da06ae40e5677588cc18886dc5bea53db3345db8003ed5b33be493d77f2e5be1"}
Apr 20 15:00:11.830982 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:11.830946 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-bbdhj"]
Apr 20 15:00:11.833166 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:11.833145 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj"
Apr 20 15:00:11.835748 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:11.835719 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-gjhr7\""
Apr 20 15:00:11.835872 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:11.835727 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 20 15:00:11.841557 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:11.841531 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-bbdhj"]
Apr 20 15:00:11.919708 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:11.919669 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jc29\" (UniqueName: \"kubernetes.io/projected/f84924d4-96fb-4afc-bde7-04da5bb85ad9-kube-api-access-8jc29\") pod \"odh-model-controller-858dbf95b8-bbdhj\" (UID: \"f84924d4-96fb-4afc-bde7-04da5bb85ad9\") " pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj"
Apr 20 15:00:11.919708 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:11.919710 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f84924d4-96fb-4afc-bde7-04da5bb85ad9-cert\") pod \"odh-model-controller-858dbf95b8-bbdhj\" (UID: \"f84924d4-96fb-4afc-bde7-04da5bb85ad9\") " pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj"
Apr 20 15:00:12.021008 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:12.020970 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jc29\" (UniqueName: \"kubernetes.io/projected/f84924d4-96fb-4afc-bde7-04da5bb85ad9-kube-api-access-8jc29\") pod \"odh-model-controller-858dbf95b8-bbdhj\" (UID: \"f84924d4-96fb-4afc-bde7-04da5bb85ad9\") " pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj"
Apr 20 15:00:12.021008 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:12.021011 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f84924d4-96fb-4afc-bde7-04da5bb85ad9-cert\") pod \"odh-model-controller-858dbf95b8-bbdhj\" (UID: \"f84924d4-96fb-4afc-bde7-04da5bb85ad9\") " pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj"
Apr 20 15:00:12.021203 ip-10-0-142-255 kubenswrapper[2538]: E0420 15:00:12.021129 2538 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 20 15:00:12.021203 ip-10-0-142-255 kubenswrapper[2538]: E0420 15:00:12.021182 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f84924d4-96fb-4afc-bde7-04da5bb85ad9-cert podName:f84924d4-96fb-4afc-bde7-04da5bb85ad9 nodeName:}" failed. No retries permitted until 2026-04-20 15:00:12.521165377 +0000 UTC m=+402.339790897 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f84924d4-96fb-4afc-bde7-04da5bb85ad9-cert") pod "odh-model-controller-858dbf95b8-bbdhj" (UID: "f84924d4-96fb-4afc-bde7-04da5bb85ad9") : secret "odh-model-controller-webhook-cert" not found
Apr 20 15:00:12.029864 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:12.029834 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jc29\" (UniqueName: \"kubernetes.io/projected/f84924d4-96fb-4afc-bde7-04da5bb85ad9-kube-api-access-8jc29\") pod \"odh-model-controller-858dbf95b8-bbdhj\" (UID: \"f84924d4-96fb-4afc-bde7-04da5bb85ad9\") " pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj"
Apr 20 15:00:12.525445 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:12.525406 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f84924d4-96fb-4afc-bde7-04da5bb85ad9-cert\") pod \"odh-model-controller-858dbf95b8-bbdhj\" (UID: \"f84924d4-96fb-4afc-bde7-04da5bb85ad9\") " pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj"
Apr 20 15:00:12.525632 ip-10-0-142-255 kubenswrapper[2538]: E0420 15:00:12.525606 2538 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 20 15:00:12.525711 ip-10-0-142-255 kubenswrapper[2538]: E0420 15:00:12.525688 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f84924d4-96fb-4afc-bde7-04da5bb85ad9-cert podName:f84924d4-96fb-4afc-bde7-04da5bb85ad9 nodeName:}" failed. No retries permitted until 2026-04-20 15:00:13.525665646 +0000 UTC m=+403.344291167 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f84924d4-96fb-4afc-bde7-04da5bb85ad9-cert") pod "odh-model-controller-858dbf95b8-bbdhj" (UID: "f84924d4-96fb-4afc-bde7-04da5bb85ad9") : secret "odh-model-controller-webhook-cert" not found
Apr 20 15:00:13.532488 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:13.532450 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f84924d4-96fb-4afc-bde7-04da5bb85ad9-cert\") pod \"odh-model-controller-858dbf95b8-bbdhj\" (UID: \"f84924d4-96fb-4afc-bde7-04da5bb85ad9\") " pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj"
Apr 20 15:00:13.534875 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:13.534855 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f84924d4-96fb-4afc-bde7-04da5bb85ad9-cert\") pod \"odh-model-controller-858dbf95b8-bbdhj\" (UID: \"f84924d4-96fb-4afc-bde7-04da5bb85ad9\") " pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj"
Apr 20 15:00:13.644366 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:13.644307 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj"
Apr 20 15:00:13.762335 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:13.762311 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-bbdhj"]
Apr 20 15:00:13.764670 ip-10-0-142-255 kubenswrapper[2538]: W0420 15:00:13.764644 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf84924d4_96fb_4afc_bde7_04da5bb85ad9.slice/crio-096abb23b23d77fe7fa58685957581757f85ac5e59f772038973ed0fcdf7b0a1 WatchSource:0}: Error finding container 096abb23b23d77fe7fa58685957581757f85ac5e59f772038973ed0fcdf7b0a1: Status 404 returned error can't find the container with id 096abb23b23d77fe7fa58685957581757f85ac5e59f772038973ed0fcdf7b0a1
Apr 20 15:00:13.840324 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:13.840281 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj" event={"ID":"f84924d4-96fb-4afc-bde7-04da5bb85ad9","Type":"ContainerStarted","Data":"096abb23b23d77fe7fa58685957581757f85ac5e59f772038973ed0fcdf7b0a1"}
Apr 20 15:00:13.841442 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:13.841417 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn" event={"ID":"29878348-f049-4527-b442-17692aa7a4f3","Type":"ContainerStarted","Data":"ff9cf3d38bd1ae51b5088c58a88bf68ecf5f5a0ceadd8ff12612b36d364ad632"}
Apr 20 15:00:13.857307 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:13.857260 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-597dfdc786-zp9qn" podStartSLOduration=2.087022206 podStartE2EDuration="5.857245812s" podCreationTimestamp="2026-04-20 15:00:08 +0000 UTC" firstStartedPulling="2026-04-20 15:00:09.082411389 +0000 UTC m=+398.901036910" lastFinishedPulling="2026-04-20 15:00:12.852634995 +0000 UTC m=+402.671260516" observedRunningTime="2026-04-20 15:00:13.855971179 +0000 UTC m=+403.674596723" watchObservedRunningTime="2026-04-20 15:00:13.857245812 +0000 UTC m=+403.675871355"
Apr 20 15:00:16.851123 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:16.851091 2538 generic.go:358] "Generic (PLEG): container finished" podID="f84924d4-96fb-4afc-bde7-04da5bb85ad9" containerID="11a2562f60cb28f425b1e4fa90a8b7bf369b07dbe05f23526f1a86444de22171" exitCode=1
Apr 20 15:00:16.851452 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:16.851176 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj" event={"ID":"f84924d4-96fb-4afc-bde7-04da5bb85ad9","Type":"ContainerDied","Data":"11a2562f60cb28f425b1e4fa90a8b7bf369b07dbe05f23526f1a86444de22171"}
Apr 20 15:00:16.851452 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:16.851403 2538 scope.go:117] "RemoveContainer" containerID="11a2562f60cb28f425b1e4fa90a8b7bf369b07dbe05f23526f1a86444de22171"
Apr 20 15:00:17.855230 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:17.855190 2538 generic.go:358] "Generic (PLEG): container finished" podID="f84924d4-96fb-4afc-bde7-04da5bb85ad9" containerID="756c3f7d86378f5225ebbb05fd2f40300bf0d838c68f379e98ed2c02109dae24" exitCode=1
Apr 20 15:00:17.855719 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:17.855268 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj" event={"ID":"f84924d4-96fb-4afc-bde7-04da5bb85ad9","Type":"ContainerDied","Data":"756c3f7d86378f5225ebbb05fd2f40300bf0d838c68f379e98ed2c02109dae24"}
Apr 20 15:00:17.855719 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:17.855311 2538 scope.go:117] "RemoveContainer" containerID="11a2562f60cb28f425b1e4fa90a8b7bf369b07dbe05f23526f1a86444de22171"
Apr 20 15:00:17.855719 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:17.855569 2538 scope.go:117] "RemoveContainer" containerID="756c3f7d86378f5225ebbb05fd2f40300bf0d838c68f379e98ed2c02109dae24"
Apr 20 15:00:17.855882 ip-10-0-142-255 kubenswrapper[2538]: E0420 15:00:17.855793 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-bbdhj_opendatahub(f84924d4-96fb-4afc-bde7-04da5bb85ad9)\"" pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj" podUID="f84924d4-96fb-4afc-bde7-04da5bb85ad9"
Apr 20 15:00:18.116720 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:18.116636 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-8jv28"]
Apr 20 15:00:18.118819 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:18.118797 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-8jv28"
Apr 20 15:00:18.121410 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:18.121390 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 20 15:00:18.121576 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:18.121559 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-bwsg6\""
Apr 20 15:00:18.129365 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:18.129300 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-8jv28"]
Apr 20 15:00:18.268305 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:18.268265 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fd8x\" (UniqueName: \"kubernetes.io/projected/c82f1c0e-3b58-49a9-90bd-72e4090803d6-kube-api-access-5fd8x\") pod \"kserve-controller-manager-856948b99f-8jv28\" (UID: \"c82f1c0e-3b58-49a9-90bd-72e4090803d6\") " pod="opendatahub/kserve-controller-manager-856948b99f-8jv28"
Apr 20 15:00:18.268504 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:18.268369 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c82f1c0e-3b58-49a9-90bd-72e4090803d6-cert\") pod \"kserve-controller-manager-856948b99f-8jv28\" (UID: \"c82f1c0e-3b58-49a9-90bd-72e4090803d6\") " pod="opendatahub/kserve-controller-manager-856948b99f-8jv28"
Apr 20 15:00:18.368932 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:18.368844 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c82f1c0e-3b58-49a9-90bd-72e4090803d6-cert\") pod \"kserve-controller-manager-856948b99f-8jv28\" (UID: \"c82f1c0e-3b58-49a9-90bd-72e4090803d6\") " pod="opendatahub/kserve-controller-manager-856948b99f-8jv28"
Apr 20 15:00:18.368932 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:18.368899 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fd8x\" (UniqueName: \"kubernetes.io/projected/c82f1c0e-3b58-49a9-90bd-72e4090803d6-kube-api-access-5fd8x\") pod \"kserve-controller-manager-856948b99f-8jv28\" (UID: \"c82f1c0e-3b58-49a9-90bd-72e4090803d6\") " pod="opendatahub/kserve-controller-manager-856948b99f-8jv28"
Apr 20 15:00:18.369091 ip-10-0-142-255 kubenswrapper[2538]: E0420 15:00:18.368998 2538 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 20 15:00:18.369091 ip-10-0-142-255 kubenswrapper[2538]: E0420 15:00:18.369063 2538 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c82f1c0e-3b58-49a9-90bd-72e4090803d6-cert podName:c82f1c0e-3b58-49a9-90bd-72e4090803d6 nodeName:}" failed. No retries permitted until 2026-04-20 15:00:18.869046797 +0000 UTC m=+408.687672318 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c82f1c0e-3b58-49a9-90bd-72e4090803d6-cert") pod "kserve-controller-manager-856948b99f-8jv28" (UID: "c82f1c0e-3b58-49a9-90bd-72e4090803d6") : secret "kserve-webhook-server-cert" not found
Apr 20 15:00:18.382990 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:18.382955 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fd8x\" (UniqueName: \"kubernetes.io/projected/c82f1c0e-3b58-49a9-90bd-72e4090803d6-kube-api-access-5fd8x\") pod \"kserve-controller-manager-856948b99f-8jv28\" (UID: \"c82f1c0e-3b58-49a9-90bd-72e4090803d6\") " pod="opendatahub/kserve-controller-manager-856948b99f-8jv28"
Apr 20 15:00:18.859661 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:18.859635 2538 scope.go:117] "RemoveContainer" containerID="756c3f7d86378f5225ebbb05fd2f40300bf0d838c68f379e98ed2c02109dae24"
Apr 20 15:00:18.860013 ip-10-0-142-255 kubenswrapper[2538]: E0420 15:00:18.859801 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-bbdhj_opendatahub(f84924d4-96fb-4afc-bde7-04da5bb85ad9)\"" pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj" podUID="f84924d4-96fb-4afc-bde7-04da5bb85ad9"
Apr 20 15:00:18.873906 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:18.873872 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c82f1c0e-3b58-49a9-90bd-72e4090803d6-cert\") pod \"kserve-controller-manager-856948b99f-8jv28\" (UID: \"c82f1c0e-3b58-49a9-90bd-72e4090803d6\") " pod="opendatahub/kserve-controller-manager-856948b99f-8jv28"
Apr 20 15:00:18.876175 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:18.876152 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c82f1c0e-3b58-49a9-90bd-72e4090803d6-cert\") pod \"kserve-controller-manager-856948b99f-8jv28\" (UID: \"c82f1c0e-3b58-49a9-90bd-72e4090803d6\") " pod="opendatahub/kserve-controller-manager-856948b99f-8jv28"
Apr 20 15:00:19.030320 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:19.030285 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-8jv28"
Apr 20 15:00:19.149264 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:19.149239 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-8jv28"]
Apr 20 15:00:19.151258 ip-10-0-142-255 kubenswrapper[2538]: W0420 15:00:19.151235 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc82f1c0e_3b58_49a9_90bd_72e4090803d6.slice/crio-0b505b648862009a2e5f480c1183df8702fe3e4d18f1f33fbc3a4c3e55ec9d76 WatchSource:0}: Error finding container 0b505b648862009a2e5f480c1183df8702fe3e4d18f1f33fbc3a4c3e55ec9d76: Status 404 returned error can't find the container with id 0b505b648862009a2e5f480c1183df8702fe3e4d18f1f33fbc3a4c3e55ec9d76
Apr 20 15:00:19.862885 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:19.862847 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-8jv28" event={"ID":"c82f1c0e-3b58-49a9-90bd-72e4090803d6","Type":"ContainerStarted","Data":"0b505b648862009a2e5f480c1183df8702fe3e4d18f1f33fbc3a4c3e55ec9d76"}
Apr 20 15:00:22.873598 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:22.873561 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-8jv28" event={"ID":"c82f1c0e-3b58-49a9-90bd-72e4090803d6","Type":"ContainerStarted","Data":"b97bebb67899d9822e75a8abbe35e90afc36c27eb142a37616d8e8b2446d6a09"}
Apr 20 15:00:22.874118 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:22.873671 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-8jv28"
Apr 20 15:00:22.913519 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:22.913467 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-8jv28" podStartSLOduration=1.9454322849999999 podStartE2EDuration="4.913450172s" podCreationTimestamp="2026-04-20 15:00:18 +0000 UTC" firstStartedPulling="2026-04-20 15:00:19.152604852 +0000 UTC m=+408.971230373" lastFinishedPulling="2026-04-20 15:00:22.120622737 +0000 UTC m=+411.939248260" observedRunningTime="2026-04-20 15:00:22.911475967 +0000 UTC m=+412.730101509" watchObservedRunningTime="2026-04-20 15:00:22.913450172 +0000 UTC m=+412.732075731"
Apr 20 15:00:23.645075 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:23.645024 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj"
Apr 20 15:00:23.645467 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:23.645453 2538 scope.go:117] "RemoveContainer" containerID="756c3f7d86378f5225ebbb05fd2f40300bf0d838c68f379e98ed2c02109dae24"
Apr 20 15:00:23.645645 ip-10-0-142-255 kubenswrapper[2538]: E0420 15:00:23.645629 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-bbdhj_opendatahub(f84924d4-96fb-4afc-bde7-04da5bb85ad9)\"" pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj" podUID="f84924d4-96fb-4afc-bde7-04da5bb85ad9"
Apr 20 15:00:23.894174 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:23.894135 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-g548h"]
Apr 20 15:00:23.896076 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:23.896024 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g548h"
kubenswrapper[2538]: I0420 15:00:23.896024 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g548h" Apr 20 15:00:23.901284 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:23.901258 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 20 15:00:23.901910 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:23.901889 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 20 15:00:23.902033 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:23.901990 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-g95fm\"" Apr 20 15:00:23.910592 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:23.910563 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-g548h"] Apr 20 15:00:24.016337 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:24.016295 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svhkr\" (UniqueName: \"kubernetes.io/projected/0f24ac59-0cd6-46bb-ac84-a9c8af02bf47-kube-api-access-svhkr\") pod \"servicemesh-operator3-55f49c5f94-g548h\" (UID: \"0f24ac59-0cd6-46bb-ac84-a9c8af02bf47\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g548h" Apr 20 15:00:24.016566 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:24.016473 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/0f24ac59-0cd6-46bb-ac84-a9c8af02bf47-operator-config\") pod \"servicemesh-operator3-55f49c5f94-g548h\" (UID: \"0f24ac59-0cd6-46bb-ac84-a9c8af02bf47\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g548h" Apr 20 15:00:24.117600 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:24.117560 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/0f24ac59-0cd6-46bb-ac84-a9c8af02bf47-operator-config\") pod \"servicemesh-operator3-55f49c5f94-g548h\" (UID: \"0f24ac59-0cd6-46bb-ac84-a9c8af02bf47\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g548h" Apr 20 15:00:24.117793 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:24.117625 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svhkr\" (UniqueName: \"kubernetes.io/projected/0f24ac59-0cd6-46bb-ac84-a9c8af02bf47-kube-api-access-svhkr\") pod \"servicemesh-operator3-55f49c5f94-g548h\" (UID: \"0f24ac59-0cd6-46bb-ac84-a9c8af02bf47\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g548h" Apr 20 15:00:24.120190 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:24.120165 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/0f24ac59-0cd6-46bb-ac84-a9c8af02bf47-operator-config\") pod \"servicemesh-operator3-55f49c5f94-g548h\" (UID: \"0f24ac59-0cd6-46bb-ac84-a9c8af02bf47\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g548h" Apr 20 15:00:24.128206 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:24.128181 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svhkr\" (UniqueName: 
\"kubernetes.io/projected/0f24ac59-0cd6-46bb-ac84-a9c8af02bf47-kube-api-access-svhkr\") pod \"servicemesh-operator3-55f49c5f94-g548h\" (UID: \"0f24ac59-0cd6-46bb-ac84-a9c8af02bf47\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g548h" Apr 20 15:00:24.205789 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:24.205704 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g548h" Apr 20 15:00:24.336289 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:24.336120 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-g548h"] Apr 20 15:00:24.339508 ip-10-0-142-255 kubenswrapper[2538]: W0420 15:00:24.339474 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f24ac59_0cd6_46bb_ac84_a9c8af02bf47.slice/crio-1f02c3d8d3a76ca526c567cadd014808a30bddad61881bd1a87442f47701fb19 WatchSource:0}: Error finding container 1f02c3d8d3a76ca526c567cadd014808a30bddad61881bd1a87442f47701fb19: Status 404 returned error can't find the container with id 1f02c3d8d3a76ca526c567cadd014808a30bddad61881bd1a87442f47701fb19 Apr 20 15:00:24.881606 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:24.881566 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g548h" event={"ID":"0f24ac59-0cd6-46bb-ac84-a9c8af02bf47","Type":"ContainerStarted","Data":"1f02c3d8d3a76ca526c567cadd014808a30bddad61881bd1a87442f47701fb19"} Apr 20 15:00:27.891572 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:27.891533 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g548h" event={"ID":"0f24ac59-0cd6-46bb-ac84-a9c8af02bf47","Type":"ContainerStarted","Data":"bedcf1ca01e4bd86a61472fbbf5698bcc828b57a61fa8a4794bb50df8332b460"} Apr 20 15:00:27.891978 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:27.891643 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g548h" Apr 20 15:00:27.911981 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:27.911923 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g548h" podStartSLOduration=2.24667675 podStartE2EDuration="4.911903928s" podCreationTimestamp="2026-04-20 15:00:23 +0000 UTC" firstStartedPulling="2026-04-20 15:00:24.342952099 +0000 UTC m=+414.161577621" lastFinishedPulling="2026-04-20 15:00:27.008179279 +0000 UTC m=+416.826804799" observedRunningTime="2026-04-20 15:00:27.909689282 +0000 UTC m=+417.728314850" watchObservedRunningTime="2026-04-20 15:00:27.911903928 +0000 UTC m=+417.730529472" Apr 20 15:00:33.644726 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:33.644694 2538 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj" Apr 20 15:00:33.645078 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:33.645062 2538 scope.go:117] "RemoveContainer" containerID="756c3f7d86378f5225ebbb05fd2f40300bf0d838c68f379e98ed2c02109dae24" Apr 20 15:00:34.917508 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:34.917465 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj" 
event={"ID":"f84924d4-96fb-4afc-bde7-04da5bb85ad9","Type":"ContainerStarted","Data":"9e07178c3f6b15151f6a67922fae262149f0157a8833417e75b93ce1939793ff"} Apr 20 15:00:34.917927 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:34.917675 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj" Apr 20 15:00:34.936947 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:34.936901 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj" podStartSLOduration=3.7519828410000002 podStartE2EDuration="23.936886395s" podCreationTimestamp="2026-04-20 15:00:11 +0000 UTC" firstStartedPulling="2026-04-20 15:00:13.766030749 +0000 UTC m=+403.584656271" lastFinishedPulling="2026-04-20 15:00:33.950934304 +0000 UTC m=+423.769559825" observedRunningTime="2026-04-20 15:00:34.935193903 +0000 UTC m=+424.753819447" watchObservedRunningTime="2026-04-20 15:00:34.936886395 +0000 UTC m=+424.755511938" Apr 20 15:00:38.888767 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.888731 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8"] Apr 20 15:00:38.890968 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.890948 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:38.893999 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.893977 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 20 15:00:38.894136 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.894037 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-j6vhr\"" Apr 20 15:00:38.894136 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.894068 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 20 15:00:38.894255 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.893973 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 15:00:38.894255 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.894011 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 20 15:00:38.897483 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.897461 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g548h" Apr 20 15:00:38.903334 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.903312 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8"] Apr 20 15:00:38.932614 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.932583 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/85351691-c4e3-449f-b83c-c56025af38f9-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:38.932614 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.932618 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/85351691-c4e3-449f-b83c-c56025af38f9-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:38.932828 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.932734 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/85351691-c4e3-449f-b83c-c56025af38f9-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:38.932828 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.932795 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/85351691-c4e3-449f-b83c-c56025af38f9-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:38.932931 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.932861 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89rlq\" (UniqueName: \"kubernetes.io/projected/85351691-c4e3-449f-b83c-c56025af38f9-kube-api-access-89rlq\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:38.932931 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.932919 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/85351691-c4e3-449f-b83c-c56025af38f9-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:38.933024 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:38.932952 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/85351691-c4e3-449f-b83c-c56025af38f9-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.033749 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.033714 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/85351691-c4e3-449f-b83c-c56025af38f9-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.033749 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.033754 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89rlq\" (UniqueName: \"kubernetes.io/projected/85351691-c4e3-449f-b83c-c56025af38f9-kube-api-access-89rlq\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.034004 ip-10-0-142-255 kubenswrapper[2538]: I0420 
15:00:39.033777 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/85351691-c4e3-449f-b83c-c56025af38f9-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.034004 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.033799 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/85351691-c4e3-449f-b83c-c56025af38f9-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.034004 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.033847 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/85351691-c4e3-449f-b83c-c56025af38f9-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.034004 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.033863 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/85351691-c4e3-449f-b83c-c56025af38f9-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.034004 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.033914 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/85351691-c4e3-449f-b83c-c56025af38f9-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.034712 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.034681 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/85351691-c4e3-449f-b83c-c56025af38f9-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.036396 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.036367 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/85351691-c4e3-449f-b83c-c56025af38f9-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.036396 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.036387 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/85351691-c4e3-449f-b83c-c56025af38f9-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.036611 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.036589 2538 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/85351691-c4e3-449f-b83c-c56025af38f9-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.036659 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.036618 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/85351691-c4e3-449f-b83c-c56025af38f9-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.045703 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.045681 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89rlq\" (UniqueName: \"kubernetes.io/projected/85351691-c4e3-449f-b83c-c56025af38f9-kube-api-access-89rlq\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.045806 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.045740 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/85351691-c4e3-449f-b83c-c56025af38f9-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-ht8z8\" (UID: \"85351691-c4e3-449f-b83c-c56025af38f9\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.201945 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.201847 2538 util.go:30] "No sandbox for pod can be found. 
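
Note: the istiod-openshift-gateway volume entries above follow the kubelet volume reconciler's fixed three-step pattern per volume: "VerifyControllerAttachedVolume started" (reconciler_common.go:251), then "MountVolume started" (reconciler_common.go:224), then "MountVolume.SetUp succeeded" (operation_generator.go:615). A quick way to confirm every volume completed the cycle is to pair the "started" and "succeeded" messages by volume name. A minimal Python sketch, assuming the journal has been saved one entry per line to kubelet.log (a placeholder path):

    import re

    # Pair "MountVolume started" with "MountVolume.SetUp succeeded" per
    # volume name, to spot volumes that never finished mounting.
    started, succeeded = {}, {}
    pat = re.compile(r'(\d{2}:\d{2}:\d{2}\.\d{6}).*?'
                     r'(MountVolume started|MountVolume\.SetUp succeeded)'
                     r' for volume \\"([^\\]+)\\"')
    for line in open("kubelet.log"):            # placeholder path
        m = pat.search(line)
        if m:
            ts, what, vol = m.groups()
            (started if what.endswith("started") else succeeded)[vol] = ts
    for vol, ts in sorted(started.items()):
        print(f'{vol}: started {ts}, succeeded {succeeded.get(vol, "NEVER")}')

For the seven istiod volumes above, every "started" has a matching "succeeded" within a few milliseconds (the projected volumes take ~12 ms longer, presumably waiting on the service account token).
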
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:39.331375 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.331195 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8"] Apr 20 15:00:39.334135 ip-10-0-142-255 kubenswrapper[2538]: W0420 15:00:39.334089 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85351691_c4e3_449f_b83c_c56025af38f9.slice/crio-b35c4061a0a63c09fa73209903d610f9e805a09acdc8eb0274a7bcf90c6f234b WatchSource:0}: Error finding container b35c4061a0a63c09fa73209903d610f9e805a09acdc8eb0274a7bcf90c6f234b: Status 404 returned error can't find the container with id b35c4061a0a63c09fa73209903d610f9e805a09acdc8eb0274a7bcf90c6f234b Apr 20 15:00:39.933904 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:39.933866 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" event={"ID":"85351691-c4e3-449f-b83c-c56025af38f9","Type":"ContainerStarted","Data":"b35c4061a0a63c09fa73209903d610f9e805a09acdc8eb0274a7bcf90c6f234b"} Apr 20 15:00:41.757876 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:41.757822 2538 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 15:00:41.758276 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:41.757924 2538 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 15:00:41.943420 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:41.943382 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" event={"ID":"85351691-c4e3-449f-b83c-c56025af38f9","Type":"ContainerStarted","Data":"fc83c9c2e92613d5c8f5cd577d95f538d06adfff013d9d6994d19c4794aa2944"} Apr 20 15:00:41.943594 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:41.943498 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:41.962953 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:41.962899 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" podStartSLOduration=1.541500062 podStartE2EDuration="3.962884655s" podCreationTimestamp="2026-04-20 15:00:38 +0000 UTC" firstStartedPulling="2026-04-20 15:00:39.336188329 +0000 UTC m=+429.154813853" lastFinishedPulling="2026-04-20 15:00:41.757572911 +0000 UTC m=+431.576198446" observedRunningTime="2026-04-20 15:00:41.962151279 +0000 UTC m=+431.780776822" watchObservedRunningTime="2026-04-20 15:00:41.962884655 +0000 UTC m=+431.781510198" Apr 20 15:00:42.948546 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:42.948510 2538 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-ht8z8 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 15:00:42.948933 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:42.948573 2538 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" podUID="85351691-c4e3-449f-b83c-c56025af38f9" 
containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:00:45.922996 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:45.922956 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-bbdhj" Apr 20 15:00:45.947676 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:45.947647 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ht8z8" Apr 20 15:00:53.882549 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:00:53.882474 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-8jv28" Apr 20 15:01:40.217356 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:40.217313 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-gcn5c"] Apr 20 15:01:40.219420 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:40.219402 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-gcn5c" Apr 20 15:01:40.222436 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:40.222411 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-8w5pz\"" Apr 20 15:01:40.222531 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:40.222435 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 15:01:40.223770 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:40.223751 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 15:01:40.231727 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:40.231707 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-gcn5c"] Apr 20 15:01:40.343682 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:40.343642 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcm4g\" (UniqueName: \"kubernetes.io/projected/24f747ee-27b3-4268-81d2-0646d4176133-kube-api-access-mcm4g\") pod \"authorino-operator-657f44b778-gcn5c\" (UID: \"24f747ee-27b3-4268-81d2-0646d4176133\") " pod="kuadrant-system/authorino-operator-657f44b778-gcn5c" Apr 20 15:01:40.444978 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:40.444943 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcm4g\" (UniqueName: \"kubernetes.io/projected/24f747ee-27b3-4268-81d2-0646d4176133-kube-api-access-mcm4g\") pod \"authorino-operator-657f44b778-gcn5c\" (UID: \"24f747ee-27b3-4268-81d2-0646d4176133\") " pod="kuadrant-system/authorino-operator-657f44b778-gcn5c" Apr 20 15:01:40.453567 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:40.453537 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcm4g\" (UniqueName: \"kubernetes.io/projected/24f747ee-27b3-4268-81d2-0646d4176133-kube-api-access-mcm4g\") pod \"authorino-operator-657f44b778-gcn5c\" (UID: \"24f747ee-27b3-4268-81d2-0646d4176133\") " pod="kuadrant-system/authorino-operator-657f44b778-gcn5c" Apr 20 15:01:40.529075 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:40.529002 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-gcn5c" Apr 20 15:01:40.648390 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:40.648366 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-gcn5c"] Apr 20 15:01:40.651059 ip-10-0-142-255 kubenswrapper[2538]: W0420 15:01:40.651027 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24f747ee_27b3_4268_81d2_0646d4176133.slice/crio-21ba776fcb0c5355add2b319cf25e5aa8d5f406826bd12da369551f15aab3cd2 WatchSource:0}: Error finding container 21ba776fcb0c5355add2b319cf25e5aa8d5f406826bd12da369551f15aab3cd2: Status 404 returned error can't find the container with id 21ba776fcb0c5355add2b319cf25e5aa8d5f406826bd12da369551f15aab3cd2 Apr 20 15:01:41.130044 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:41.130004 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-gcn5c" event={"ID":"24f747ee-27b3-4268-81d2-0646d4176133","Type":"ContainerStarted","Data":"21ba776fcb0c5355add2b319cf25e5aa8d5f406826bd12da369551f15aab3cd2"} Apr 20 15:01:43.138246 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:43.138207 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-gcn5c" event={"ID":"24f747ee-27b3-4268-81d2-0646d4176133","Type":"ContainerStarted","Data":"4a9d9806b2986decf448d5e0b9bdbf51dde4497e4a089e648adf989f114b7bae"} Apr 20 15:01:43.138670 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:43.138355 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-gcn5c" Apr 20 15:01:43.159401 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:43.159335 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-gcn5c" podStartSLOduration=1.5159438939999998 podStartE2EDuration="3.159319856s" podCreationTimestamp="2026-04-20 15:01:40 +0000 UTC" firstStartedPulling="2026-04-20 15:01:40.653065919 +0000 UTC m=+490.471691440" lastFinishedPulling="2026-04-20 15:01:42.296441881 +0000 UTC m=+492.115067402" observedRunningTime="2026-04-20 15:01:43.158333506 +0000 UTC m=+492.976959052" watchObservedRunningTime="2026-04-20 15:01:43.159319856 +0000 UTC m=+492.977945398" Apr 20 15:01:45.371963 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:45.371928 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb"] Apr 20 15:01:45.374047 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:45.374031 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" Apr 20 15:01:45.376632 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:45.376606 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-s85ph\"" Apr 20 15:01:45.384087 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:45.384068 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb"] Apr 20 15:01:45.486517 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:45.486483 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94f2b\" (UniqueName: \"kubernetes.io/projected/db5d968e-8d6c-4360-8e8d-dbcf44e3ff53-kube-api-access-94f2b\") pod \"limitador-operator-controller-manager-85c4996f8c-5llfb\" (UID: \"db5d968e-8d6c-4360-8e8d-dbcf44e3ff53\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" Apr 20 15:01:45.587852 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:45.587816 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94f2b\" (UniqueName: \"kubernetes.io/projected/db5d968e-8d6c-4360-8e8d-dbcf44e3ff53-kube-api-access-94f2b\") pod \"limitador-operator-controller-manager-85c4996f8c-5llfb\" (UID: \"db5d968e-8d6c-4360-8e8d-dbcf44e3ff53\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" Apr 20 15:01:45.596820 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:45.596791 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94f2b\" (UniqueName: \"kubernetes.io/projected/db5d968e-8d6c-4360-8e8d-dbcf44e3ff53-kube-api-access-94f2b\") pod \"limitador-operator-controller-manager-85c4996f8c-5llfb\" (UID: \"db5d968e-8d6c-4360-8e8d-dbcf44e3ff53\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" Apr 20 15:01:45.684743 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:45.684674 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" Apr 20 15:01:45.799741 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:45.799716 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb"] Apr 20 15:01:45.802304 ip-10-0-142-255 kubenswrapper[2538]: W0420 15:01:45.802276 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb5d968e_8d6c_4360_8e8d_dbcf44e3ff53.slice/crio-4b6d69bb33516504cbe7eee9294287a8e19becf13081fc17286c6616ac02f6c4 WatchSource:0}: Error finding container 4b6d69bb33516504cbe7eee9294287a8e19becf13081fc17286c6616ac02f6c4: Status 404 returned error can't find the container with id 4b6d69bb33516504cbe7eee9294287a8e19becf13081fc17286c6616ac02f6c4 Apr 20 15:01:46.153130 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:46.153094 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" event={"ID":"db5d968e-8d6c-4360-8e8d-dbcf44e3ff53","Type":"ContainerStarted","Data":"4b6d69bb33516504cbe7eee9294287a8e19becf13081fc17286c6616ac02f6c4"} Apr 20 15:01:49.166243 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:49.166210 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" event={"ID":"db5d968e-8d6c-4360-8e8d-dbcf44e3ff53","Type":"ContainerStarted","Data":"74b0a746c9e93c4bbf850e3b8c13b00437f5eb8d56bdc3056089ff971875c7d9"} Apr 20 15:01:49.166648 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:49.166370 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" Apr 20 15:01:49.185311 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:49.185242 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" podStartSLOduration=1.630526717 podStartE2EDuration="4.18522439s" podCreationTimestamp="2026-04-20 15:01:45 +0000 UTC" firstStartedPulling="2026-04-20 15:01:45.804214705 +0000 UTC m=+495.622840226" lastFinishedPulling="2026-04-20 15:01:48.358912373 +0000 UTC m=+498.177537899" observedRunningTime="2026-04-20 15:01:49.18497884 +0000 UTC m=+499.003604394" watchObservedRunningTime="2026-04-20 15:01:49.18522439 +0000 UTC m=+499.003849933" Apr 20 15:01:54.143822 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:54.143790 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-gcn5c" Apr 20 15:01:56.407860 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.407830 2538 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb"] Apr 20 15:01:56.408323 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.408072 2538 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" podUID="db5d968e-8d6c-4360-8e8d-dbcf44e3ff53" containerName="manager" containerID="cri-o://74b0a746c9e93c4bbf850e3b8c13b00437f5eb8d56bdc3056089ff971875c7d9" gracePeriod=2 Apr 20 15:01:56.409565 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.409539 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" Apr 20 15:01:56.417063 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.417037 2538 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb"] Apr 20 15:01:56.418615 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.418563 2538 status_manager.go:895] "Failed to get status for pod" podUID="db5d968e-8d6c-4360-8e8d-dbcf44e3ff53" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" err="pods \"limitador-operator-controller-manager-85c4996f8c-5llfb\" is forbidden: User \"system:node:ip-10-0-142-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-255.ec2.internal' and this object" Apr 20 15:01:56.431760 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.431737 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rmvcw"] Apr 20 15:01:56.432033 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.432016 2538 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db5d968e-8d6c-4360-8e8d-dbcf44e3ff53" containerName="manager" Apr 20 15:01:56.432033 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.432031 2538 state_mem.go:107] "Deleted CPUSet assignment" podUID="db5d968e-8d6c-4360-8e8d-dbcf44e3ff53" containerName="manager" Apr 20 15:01:56.432166 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.432100 2538 memory_manager.go:356] "RemoveStaleState removing state" podUID="db5d968e-8d6c-4360-8e8d-dbcf44e3ff53" containerName="manager" Apr 20 15:01:56.433830 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.433812 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rmvcw" Apr 20 15:01:56.436291 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.436259 2538 status_manager.go:895] "Failed to get status for pod" podUID="db5d968e-8d6c-4360-8e8d-dbcf44e3ff53" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" err="pods \"limitador-operator-controller-manager-85c4996f8c-5llfb\" is forbidden: User \"system:node:ip-10-0-142-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-255.ec2.internal' and this object" Apr 20 15:01:56.443985 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.443965 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rmvcw"] Apr 20 15:01:56.575063 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.575032 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzsqc\" (UniqueName: \"kubernetes.io/projected/972c97bb-ea8a-41d7-9baf-7a6e83378abf-kube-api-access-pzsqc\") pod \"limitador-operator-controller-manager-85c4996f8c-rmvcw\" (UID: \"972c97bb-ea8a-41d7-9baf-7a6e83378abf\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rmvcw" Apr 20 15:01:56.634332 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.634310 2538 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" Apr 20 15:01:56.636988 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.636961 2538 status_manager.go:895] "Failed to get status for pod" podUID="db5d968e-8d6c-4360-8e8d-dbcf44e3ff53" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" err="pods \"limitador-operator-controller-manager-85c4996f8c-5llfb\" is forbidden: User \"system:node:ip-10-0-142-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-255.ec2.internal' and this object" Apr 20 15:01:56.675639 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.675573 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzsqc\" (UniqueName: \"kubernetes.io/projected/972c97bb-ea8a-41d7-9baf-7a6e83378abf-kube-api-access-pzsqc\") pod \"limitador-operator-controller-manager-85c4996f8c-rmvcw\" (UID: \"972c97bb-ea8a-41d7-9baf-7a6e83378abf\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rmvcw" Apr 20 15:01:56.700773 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.700746 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzsqc\" (UniqueName: \"kubernetes.io/projected/972c97bb-ea8a-41d7-9baf-7a6e83378abf-kube-api-access-pzsqc\") pod \"limitador-operator-controller-manager-85c4996f8c-rmvcw\" (UID: \"972c97bb-ea8a-41d7-9baf-7a6e83378abf\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rmvcw" Apr 20 15:01:56.776431 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.776397 2538 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94f2b\" (UniqueName: \"kubernetes.io/projected/db5d968e-8d6c-4360-8e8d-dbcf44e3ff53-kube-api-access-94f2b\") pod \"db5d968e-8d6c-4360-8e8d-dbcf44e3ff53\" (UID: \"db5d968e-8d6c-4360-8e8d-dbcf44e3ff53\") " Apr 20 15:01:56.778506 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.778476 2538 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db5d968e-8d6c-4360-8e8d-dbcf44e3ff53-kube-api-access-94f2b" (OuterVolumeSpecName: "kube-api-access-94f2b") pod "db5d968e-8d6c-4360-8e8d-dbcf44e3ff53" (UID: "db5d968e-8d6c-4360-8e8d-dbcf44e3ff53"). InnerVolumeSpecName "kube-api-access-94f2b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:01:56.793615 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.793586 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rmvcw" Apr 20 15:01:56.878013 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.877981 2538 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94f2b\" (UniqueName: \"kubernetes.io/projected/db5d968e-8d6c-4360-8e8d-dbcf44e3ff53-kube-api-access-94f2b\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 20 15:01:56.930034 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:56.930010 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rmvcw"] Apr 20 15:01:56.932127 ip-10-0-142-255 kubenswrapper[2538]: W0420 15:01:56.932104 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod972c97bb_ea8a_41d7_9baf_7a6e83378abf.slice/crio-f74e39246a44d55d0913d61b51653bb60e29a2a54ae1535523b47f917cee501c WatchSource:0}: Error finding container f74e39246a44d55d0913d61b51653bb60e29a2a54ae1535523b47f917cee501c: Status 404 returned error can't find the container with id f74e39246a44d55d0913d61b51653bb60e29a2a54ae1535523b47f917cee501c Apr 20 15:01:57.193272 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:57.193180 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rmvcw" event={"ID":"972c97bb-ea8a-41d7-9baf-7a6e83378abf","Type":"ContainerStarted","Data":"192c3324e5f3b40bac6a991c0369766df068272dc9539ce82e218881bd54f8b6"} Apr 20 15:01:57.193272 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:57.193219 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rmvcw" event={"ID":"972c97bb-ea8a-41d7-9baf-7a6e83378abf","Type":"ContainerStarted","Data":"f74e39246a44d55d0913d61b51653bb60e29a2a54ae1535523b47f917cee501c"} Apr 20 15:01:57.193524 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:57.193269 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rmvcw" Apr 20 15:01:57.194281 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:57.194259 2538 generic.go:358] "Generic (PLEG): container finished" podID="db5d968e-8d6c-4360-8e8d-dbcf44e3ff53" containerID="74b0a746c9e93c4bbf850e3b8c13b00437f5eb8d56bdc3056089ff971875c7d9" exitCode=0 Apr 20 15:01:57.194393 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:57.194296 2538 scope.go:117] "RemoveContainer" containerID="74b0a746c9e93c4bbf850e3b8c13b00437f5eb8d56bdc3056089ff971875c7d9" Apr 20 15:01:57.194393 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:57.194295 2538 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5llfb" Apr 20 15:01:57.205916 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:57.205900 2538 scope.go:117] "RemoveContainer" containerID="74b0a746c9e93c4bbf850e3b8c13b00437f5eb8d56bdc3056089ff971875c7d9" Apr 20 15:01:57.206149 ip-10-0-142-255 kubenswrapper[2538]: E0420 15:01:57.206133 2538 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74b0a746c9e93c4bbf850e3b8c13b00437f5eb8d56bdc3056089ff971875c7d9\": container with ID starting with 74b0a746c9e93c4bbf850e3b8c13b00437f5eb8d56bdc3056089ff971875c7d9 not found: ID does not exist" containerID="74b0a746c9e93c4bbf850e3b8c13b00437f5eb8d56bdc3056089ff971875c7d9" Apr 20 15:01:57.206207 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:57.206155 2538 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74b0a746c9e93c4bbf850e3b8c13b00437f5eb8d56bdc3056089ff971875c7d9"} err="failed to get container status \"74b0a746c9e93c4bbf850e3b8c13b00437f5eb8d56bdc3056089ff971875c7d9\": rpc error: code = NotFound desc = could not find container \"74b0a746c9e93c4bbf850e3b8c13b00437f5eb8d56bdc3056089ff971875c7d9\": container with ID starting with 74b0a746c9e93c4bbf850e3b8c13b00437f5eb8d56bdc3056089ff971875c7d9 not found: ID does not exist" Apr 20 15:01:57.222331 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:57.222281 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rmvcw" podStartSLOduration=1.222263889 podStartE2EDuration="1.222263889s" podCreationTimestamp="2026-04-20 15:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:01:57.221432451 +0000 UTC m=+507.040057995" watchObservedRunningTime="2026-04-20 15:01:57.222263889 +0000 UTC m=+507.040889433" Apr 20 15:01:58.706652 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:01:58.706620 2538 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db5d968e-8d6c-4360-8e8d-dbcf44e3ff53" path="/var/lib/kubelet/pods/db5d968e-8d6c-4360-8e8d-dbcf44e3ff53/volumes" Apr 20 15:02:08.201091 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:08.201056 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rmvcw" Apr 20 15:02:29.687685 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:29.687638 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lgbz5"] Apr 20 15:02:29.691217 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:29.691190 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-lgbz5" Apr 20 15:02:29.693853 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:29.693829 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-jv4pj\"" Apr 20 15:02:29.696849 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:29.696826 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lgbz5"] Apr 20 15:02:29.735923 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:29.735886 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cjs2\" (UniqueName: \"kubernetes.io/projected/9d540913-a044-4f67-97e6-690492bf4fbd-kube-api-access-6cjs2\") pod \"authorino-f99f4b5cd-lgbz5\" (UID: \"9d540913-a044-4f67-97e6-690492bf4fbd\") " pod="kuadrant-system/authorino-f99f4b5cd-lgbz5" Apr 20 15:02:29.837162 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:29.837132 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cjs2\" (UniqueName: \"kubernetes.io/projected/9d540913-a044-4f67-97e6-690492bf4fbd-kube-api-access-6cjs2\") pod \"authorino-f99f4b5cd-lgbz5\" (UID: \"9d540913-a044-4f67-97e6-690492bf4fbd\") " pod="kuadrant-system/authorino-f99f4b5cd-lgbz5" Apr 20 15:02:29.845612 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:29.845589 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cjs2\" (UniqueName: \"kubernetes.io/projected/9d540913-a044-4f67-97e6-690492bf4fbd-kube-api-access-6cjs2\") pod \"authorino-f99f4b5cd-lgbz5\" (UID: \"9d540913-a044-4f67-97e6-690492bf4fbd\") " pod="kuadrant-system/authorino-f99f4b5cd-lgbz5" Apr 20 15:02:30.004293 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:30.004217 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-lgbz5" Apr 20 15:02:30.121078 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:30.121050 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lgbz5"] Apr 20 15:02:30.124014 ip-10-0-142-255 kubenswrapper[2538]: W0420 15:02:30.123979 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d540913_a044_4f67_97e6_690492bf4fbd.slice/crio-9e46437cacc0cfa3cd26d34d47e28b19d12baf74b45afbcb28a0bf71082ab295 WatchSource:0}: Error finding container 9e46437cacc0cfa3cd26d34d47e28b19d12baf74b45afbcb28a0bf71082ab295: Status 404 returned error can't find the container with id 9e46437cacc0cfa3cd26d34d47e28b19d12baf74b45afbcb28a0bf71082ab295 Apr 20 15:02:30.306274 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:30.306195 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-lgbz5" event={"ID":"9d540913-a044-4f67-97e6-690492bf4fbd","Type":"ContainerStarted","Data":"9e46437cacc0cfa3cd26d34d47e28b19d12baf74b45afbcb28a0bf71082ab295"} Apr 20 15:02:34.322446 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:34.322409 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-lgbz5" event={"ID":"9d540913-a044-4f67-97e6-690492bf4fbd","Type":"ContainerStarted","Data":"e5914d1b600c418a91cffa08b5e0210e7b42da76de733de013a7c59edc36f425"} Apr 20 15:02:34.338078 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:34.338031 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-lgbz5" podStartSLOduration=1.909688759 podStartE2EDuration="5.33801715s" podCreationTimestamp="2026-04-20 15:02:29 +0000 UTC" firstStartedPulling="2026-04-20 15:02:30.125592123 +0000 UTC m=+539.944217647" lastFinishedPulling="2026-04-20 15:02:33.553920516 +0000 UTC m=+543.372546038" observedRunningTime="2026-04-20 15:02:34.336637275 +0000 UTC m=+544.155262812" watchObservedRunningTime="2026-04-20 15:02:34.33801715 +0000 UTC m=+544.156642694" Apr 20 15:02:34.809322 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:34.809242 2538 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lgbz5"] Apr 20 15:02:36.329254 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:36.329208 2538 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-lgbz5" podUID="9d540913-a044-4f67-97e6-690492bf4fbd" containerName="authorino" containerID="cri-o://e5914d1b600c418a91cffa08b5e0210e7b42da76de733de013a7c59edc36f425" gracePeriod=30 Apr 20 15:02:36.567504 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:36.567480 2538 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-lgbz5" Apr 20 15:02:36.591538 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:36.591465 2538 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cjs2\" (UniqueName: \"kubernetes.io/projected/9d540913-a044-4f67-97e6-690492bf4fbd-kube-api-access-6cjs2\") pod \"9d540913-a044-4f67-97e6-690492bf4fbd\" (UID: \"9d540913-a044-4f67-97e6-690492bf4fbd\") " Apr 20 15:02:36.593519 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:36.593491 2538 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d540913-a044-4f67-97e6-690492bf4fbd-kube-api-access-6cjs2" (OuterVolumeSpecName: "kube-api-access-6cjs2") pod "9d540913-a044-4f67-97e6-690492bf4fbd" (UID: "9d540913-a044-4f67-97e6-690492bf4fbd"). InnerVolumeSpecName "kube-api-access-6cjs2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:02:36.692077 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:36.692039 2538 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6cjs2\" (UniqueName: \"kubernetes.io/projected/9d540913-a044-4f67-97e6-690492bf4fbd-kube-api-access-6cjs2\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 20 15:02:37.333045 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:37.333009 2538 generic.go:358] "Generic (PLEG): container finished" podID="9d540913-a044-4f67-97e6-690492bf4fbd" containerID="e5914d1b600c418a91cffa08b5e0210e7b42da76de733de013a7c59edc36f425" exitCode=0 Apr 20 15:02:37.333563 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:37.333060 2538 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-lgbz5" Apr 20 15:02:37.333563 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:37.333097 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-lgbz5" event={"ID":"9d540913-a044-4f67-97e6-690492bf4fbd","Type":"ContainerDied","Data":"e5914d1b600c418a91cffa08b5e0210e7b42da76de733de013a7c59edc36f425"} Apr 20 15:02:37.333563 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:37.333143 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-lgbz5" event={"ID":"9d540913-a044-4f67-97e6-690492bf4fbd","Type":"ContainerDied","Data":"9e46437cacc0cfa3cd26d34d47e28b19d12baf74b45afbcb28a0bf71082ab295"} Apr 20 15:02:37.333563 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:37.333168 2538 scope.go:117] "RemoveContainer" containerID="e5914d1b600c418a91cffa08b5e0210e7b42da76de733de013a7c59edc36f425" Apr 20 15:02:37.341151 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:37.341127 2538 scope.go:117] "RemoveContainer" containerID="e5914d1b600c418a91cffa08b5e0210e7b42da76de733de013a7c59edc36f425" Apr 20 15:02:37.341419 ip-10-0-142-255 kubenswrapper[2538]: E0420 15:02:37.341401 2538 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5914d1b600c418a91cffa08b5e0210e7b42da76de733de013a7c59edc36f425\": container with ID starting with e5914d1b600c418a91cffa08b5e0210e7b42da76de733de013a7c59edc36f425 not found: ID does not exist" containerID="e5914d1b600c418a91cffa08b5e0210e7b42da76de733de013a7c59edc36f425" Apr 20 15:02:37.341468 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:37.341430 2538 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5914d1b600c418a91cffa08b5e0210e7b42da76de733de013a7c59edc36f425"} 
err="failed to get container status \"e5914d1b600c418a91cffa08b5e0210e7b42da76de733de013a7c59edc36f425\": rpc error: code = NotFound desc = could not find container \"e5914d1b600c418a91cffa08b5e0210e7b42da76de733de013a7c59edc36f425\": container with ID starting with e5914d1b600c418a91cffa08b5e0210e7b42da76de733de013a7c59edc36f425 not found: ID does not exist" Apr 20 15:02:37.358501 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:37.358471 2538 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lgbz5"] Apr 20 15:02:37.362745 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:37.362721 2538 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lgbz5"] Apr 20 15:02:38.705755 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:02:38.705721 2538 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d540913-a044-4f67-97e6-690492bf4fbd" path="/var/lib/kubelet/pods/9d540913-a044-4f67-97e6-690492bf4fbd/volumes" Apr 20 15:03:03.211531 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.211491 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9xt9v"] Apr 20 15:03:03.212029 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.211796 2538 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d540913-a044-4f67-97e6-690492bf4fbd" containerName="authorino" Apr 20 15:03:03.212029 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.211808 2538 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d540913-a044-4f67-97e6-690492bf4fbd" containerName="authorino" Apr 20 15:03:03.212029 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.211853 2538 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d540913-a044-4f67-97e6-690492bf4fbd" containerName="authorino" Apr 20 15:03:03.213905 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.213888 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9xt9v" Apr 20 15:03:03.216921 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.216902 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-jv4pj\"" Apr 20 15:03:03.221240 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.221219 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9xt9v"] Apr 20 15:03:03.307768 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.307726 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cjkm\" (UniqueName: \"kubernetes.io/projected/f6753b2a-6c85-4695-9134-6a3c044b7664-kube-api-access-4cjkm\") pod \"authorino-8b475cf9f-9xt9v\" (UID: \"f6753b2a-6c85-4695-9134-6a3c044b7664\") " pod="kuadrant-system/authorino-8b475cf9f-9xt9v" Apr 20 15:03:03.388402 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.388366 2538 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9xt9v"] Apr 20 15:03:03.388612 ip-10-0-142-255 kubenswrapper[2538]: E0420 15:03:03.388590 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-4cjkm], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-9xt9v" podUID="f6753b2a-6c85-4695-9134-6a3c044b7664" Apr 20 15:03:03.408546 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.408501 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cjkm\" (UniqueName: \"kubernetes.io/projected/f6753b2a-6c85-4695-9134-6a3c044b7664-kube-api-access-4cjkm\") pod \"authorino-8b475cf9f-9xt9v\" (UID: \"f6753b2a-6c85-4695-9134-6a3c044b7664\") " pod="kuadrant-system/authorino-8b475cf9f-9xt9v" Apr 20 15:03:03.409988 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.409967 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-56fdd757f5-tbf47"] Apr 20 15:03:03.412011 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.411995 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-tbf47" Apr 20 15:03:03.414567 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.414544 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 20 15:03:03.419299 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.419164 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9xt9v" Apr 20 15:03:03.421413 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.421392 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-tbf47"] Apr 20 15:03:03.422801 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.422777 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cjkm\" (UniqueName: \"kubernetes.io/projected/f6753b2a-6c85-4695-9134-6a3c044b7664-kube-api-access-4cjkm\") pod \"authorino-8b475cf9f-9xt9v\" (UID: \"f6753b2a-6c85-4695-9134-6a3c044b7664\") " pod="kuadrant-system/authorino-8b475cf9f-9xt9v" Apr 20 15:03:03.432828 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.432804 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9xt9v" Apr 20 15:03:03.461713 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.461630 2538 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-tbf47"] Apr 20 15:03:03.461870 ip-10-0-142-255 kubenswrapper[2538]: E0420 15:03:03.461853 2538 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-8rgm2 tls-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-56fdd757f5-tbf47" podUID="a4f0170e-e725-4798-8f07-21188b2fc7d6" Apr 20 15:03:03.509134 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.509096 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rgm2\" (UniqueName: \"kubernetes.io/projected/a4f0170e-e725-4798-8f07-21188b2fc7d6-kube-api-access-8rgm2\") pod \"authorino-56fdd757f5-tbf47\" (UID: \"a4f0170e-e725-4798-8f07-21188b2fc7d6\") " pod="kuadrant-system/authorino-56fdd757f5-tbf47" Apr 20 15:03:03.509305 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.509227 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a4f0170e-e725-4798-8f07-21188b2fc7d6-tls-cert\") pod \"authorino-56fdd757f5-tbf47\" (UID: \"a4f0170e-e725-4798-8f07-21188b2fc7d6\") " pod="kuadrant-system/authorino-56fdd757f5-tbf47" Apr 20 15:03:03.610336 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.610294 2538 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cjkm\" (UniqueName: \"kubernetes.io/projected/f6753b2a-6c85-4695-9134-6a3c044b7664-kube-api-access-4cjkm\") pod \"f6753b2a-6c85-4695-9134-6a3c044b7664\" (UID: \"f6753b2a-6c85-4695-9134-6a3c044b7664\") " Apr 20 15:03:03.610558 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.610480 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a4f0170e-e725-4798-8f07-21188b2fc7d6-tls-cert\") pod \"authorino-56fdd757f5-tbf47\" (UID: \"a4f0170e-e725-4798-8f07-21188b2fc7d6\") " pod="kuadrant-system/authorino-56fdd757f5-tbf47" Apr 20 15:03:03.610558 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.610512 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rgm2\" (UniqueName: \"kubernetes.io/projected/a4f0170e-e725-4798-8f07-21188b2fc7d6-kube-api-access-8rgm2\") pod \"authorino-56fdd757f5-tbf47\" (UID: \"a4f0170e-e725-4798-8f07-21188b2fc7d6\") " pod="kuadrant-system/authorino-56fdd757f5-tbf47" Apr 20 15:03:03.612535 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.612497 2538 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6753b2a-6c85-4695-9134-6a3c044b7664-kube-api-access-4cjkm" (OuterVolumeSpecName: "kube-api-access-4cjkm") pod "f6753b2a-6c85-4695-9134-6a3c044b7664" (UID: "f6753b2a-6c85-4695-9134-6a3c044b7664"). InnerVolumeSpecName "kube-api-access-4cjkm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:03:03.612852 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.612833 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a4f0170e-e725-4798-8f07-21188b2fc7d6-tls-cert\") pod \"authorino-56fdd757f5-tbf47\" (UID: \"a4f0170e-e725-4798-8f07-21188b2fc7d6\") " pod="kuadrant-system/authorino-56fdd757f5-tbf47" Apr 20 15:03:03.621119 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.621091 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rgm2\" (UniqueName: \"kubernetes.io/projected/a4f0170e-e725-4798-8f07-21188b2fc7d6-kube-api-access-8rgm2\") pod \"authorino-56fdd757f5-tbf47\" (UID: \"a4f0170e-e725-4798-8f07-21188b2fc7d6\") " pod="kuadrant-system/authorino-56fdd757f5-tbf47" Apr 20 15:03:03.711506 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:03.711475 2538 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4cjkm\" (UniqueName: \"kubernetes.io/projected/f6753b2a-6c85-4695-9134-6a3c044b7664-kube-api-access-4cjkm\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 20 15:03:04.423147 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:04.423111 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-tbf47" Apr 20 15:03:04.423559 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:04.423159 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9xt9v" Apr 20 15:03:04.428537 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:04.428514 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-tbf47" Apr 20 15:03:04.455515 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:04.453011 2538 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9xt9v"] Apr 20 15:03:04.459415 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:04.459380 2538 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9xt9v"] Apr 20 15:03:04.517618 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:04.517581 2538 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rgm2\" (UniqueName: \"kubernetes.io/projected/a4f0170e-e725-4798-8f07-21188b2fc7d6-kube-api-access-8rgm2\") pod \"a4f0170e-e725-4798-8f07-21188b2fc7d6\" (UID: \"a4f0170e-e725-4798-8f07-21188b2fc7d6\") " Apr 20 15:03:04.517787 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:04.517644 2538 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a4f0170e-e725-4798-8f07-21188b2fc7d6-tls-cert\") pod \"a4f0170e-e725-4798-8f07-21188b2fc7d6\" (UID: \"a4f0170e-e725-4798-8f07-21188b2fc7d6\") " Apr 20 15:03:04.519773 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:04.519746 2538 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f0170e-e725-4798-8f07-21188b2fc7d6-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "a4f0170e-e725-4798-8f07-21188b2fc7d6" (UID: "a4f0170e-e725-4798-8f07-21188b2fc7d6"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:03:04.519882 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:04.519741 2538 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f0170e-e725-4798-8f07-21188b2fc7d6-kube-api-access-8rgm2" (OuterVolumeSpecName: "kube-api-access-8rgm2") pod "a4f0170e-e725-4798-8f07-21188b2fc7d6" (UID: "a4f0170e-e725-4798-8f07-21188b2fc7d6"). InnerVolumeSpecName "kube-api-access-8rgm2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:03:04.619246 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:04.619205 2538 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a4f0170e-e725-4798-8f07-21188b2fc7d6-tls-cert\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 20 15:03:04.619246 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:04.619239 2538 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8rgm2\" (UniqueName: \"kubernetes.io/projected/a4f0170e-e725-4798-8f07-21188b2fc7d6-kube-api-access-8rgm2\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 20 15:03:04.706882 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:04.706795 2538 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6753b2a-6c85-4695-9134-6a3c044b7664" path="/var/lib/kubelet/pods/f6753b2a-6c85-4695-9134-6a3c044b7664/volumes" Apr 20 15:03:05.427016 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:05.426986 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-tbf47" Apr 20 15:03:05.452949 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:05.452912 2538 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-tbf47"] Apr 20 15:03:05.454279 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:05.454256 2538 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-tbf47"] Apr 20 15:03:06.706557 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:06.706524 2538 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4f0170e-e725-4798-8f07-21188b2fc7d6" path="/var/lib/kubelet/pods/a4f0170e-e725-4798-8f07-21188b2fc7d6/volumes" Apr 20 15:03:53.472201 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:53.472123 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-5cbc9fdf6-pwbwn"] Apr 20 15:03:53.475311 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:53.475288 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-5cbc9fdf6-pwbwn" Apr 20 15:03:53.477879 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:53.477857 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 15:03:53.479190 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:53.479172 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-t7lp5\"" Apr 20 15:03:53.479270 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:53.479238 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 15:03:53.483438 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:53.483417 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5cbc9fdf6-pwbwn"] Apr 20 15:03:53.596196 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:53.596157 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/49533dbe-947d-4a49-959b-62f418f4ec13-maas-api-tls\") pod \"maas-api-5cbc9fdf6-pwbwn\" (UID: \"49533dbe-947d-4a49-959b-62f418f4ec13\") " pod="opendatahub/maas-api-5cbc9fdf6-pwbwn" Apr 20 15:03:53.596411 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:53.596267 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzb77\" (UniqueName: \"kubernetes.io/projected/49533dbe-947d-4a49-959b-62f418f4ec13-kube-api-access-gzb77\") pod \"maas-api-5cbc9fdf6-pwbwn\" (UID: \"49533dbe-947d-4a49-959b-62f418f4ec13\") " pod="opendatahub/maas-api-5cbc9fdf6-pwbwn" Apr 20 15:03:53.696896 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:53.696862 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzb77\" (UniqueName: \"kubernetes.io/projected/49533dbe-947d-4a49-959b-62f418f4ec13-kube-api-access-gzb77\") pod \"maas-api-5cbc9fdf6-pwbwn\" (UID: \"49533dbe-947d-4a49-959b-62f418f4ec13\") " pod="opendatahub/maas-api-5cbc9fdf6-pwbwn" Apr 20 15:03:53.697073 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:53.696917 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/49533dbe-947d-4a49-959b-62f418f4ec13-maas-api-tls\") pod \"maas-api-5cbc9fdf6-pwbwn\" (UID: \"49533dbe-947d-4a49-959b-62f418f4ec13\") " pod="opendatahub/maas-api-5cbc9fdf6-pwbwn" Apr 20 15:03:53.699281 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:53.699260 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/49533dbe-947d-4a49-959b-62f418f4ec13-maas-api-tls\") pod \"maas-api-5cbc9fdf6-pwbwn\" (UID: \"49533dbe-947d-4a49-959b-62f418f4ec13\") " pod="opendatahub/maas-api-5cbc9fdf6-pwbwn" Apr 20 15:03:53.704292 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:53.704265 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzb77\" (UniqueName: \"kubernetes.io/projected/49533dbe-947d-4a49-959b-62f418f4ec13-kube-api-access-gzb77\") pod \"maas-api-5cbc9fdf6-pwbwn\" (UID: \"49533dbe-947d-4a49-959b-62f418f4ec13\") " pod="opendatahub/maas-api-5cbc9fdf6-pwbwn" Apr 20 15:03:53.786978 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:53.786892 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-5cbc9fdf6-pwbwn" Apr 20 15:03:53.915637 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:53.915611 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5cbc9fdf6-pwbwn"] Apr 20 15:03:53.918546 ip-10-0-142-255 kubenswrapper[2538]: W0420 15:03:53.918517 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49533dbe_947d_4a49_959b_62f418f4ec13.slice/crio-09c9b2a0167cbedce68a9e4a0cbdffa8dc60ed122430eb2162e121342d32bb46 WatchSource:0}: Error finding container 09c9b2a0167cbedce68a9e4a0cbdffa8dc60ed122430eb2162e121342d32bb46: Status 404 returned error can't find the container with id 09c9b2a0167cbedce68a9e4a0cbdffa8dc60ed122430eb2162e121342d32bb46 Apr 20 15:03:54.579705 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:54.579659 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5cbc9fdf6-pwbwn" event={"ID":"49533dbe-947d-4a49-959b-62f418f4ec13","Type":"ContainerStarted","Data":"09c9b2a0167cbedce68a9e4a0cbdffa8dc60ed122430eb2162e121342d32bb46"} Apr 20 15:03:56.589686 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:56.589638 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5cbc9fdf6-pwbwn" event={"ID":"49533dbe-947d-4a49-959b-62f418f4ec13","Type":"ContainerStarted","Data":"6101de1839787cb966a721daaa47af1e9d88943517449c2afee374b3b4d3e300"} Apr 20 15:03:56.590411 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:56.590389 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-5cbc9fdf6-pwbwn" Apr 20 15:03:56.606279 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:03:56.606237 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-5cbc9fdf6-pwbwn" podStartSLOduration=1.305101713 podStartE2EDuration="3.606219437s" podCreationTimestamp="2026-04-20 15:03:53 +0000 UTC" firstStartedPulling="2026-04-20 15:03:53.920188086 +0000 UTC m=+623.738813607" lastFinishedPulling="2026-04-20 15:03:56.221305796 +0000 UTC m=+626.039931331" observedRunningTime="2026-04-20 15:03:56.604901631 +0000 UTC m=+626.423527171" watchObservedRunningTime="2026-04-20 15:03:56.606219437 +0000 UTC m=+626.424844980" Apr 20 15:04:02.624305 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.624270 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x"] Apr 20 15:04:02.627597 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.627576 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.630280 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.630250 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 15:04:02.630426 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.630320 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 15:04:02.631590 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.631562 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-t5pz2\"" Apr 20 15:04:02.631682 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.631587 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 20 15:04:02.639325 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.639296 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x"] Apr 20 15:04:02.773294 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.773242 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6f9faf4b-08a0-499e-8ecc-028c1a09d000-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.773294 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.773296 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6f9faf4b-08a0-499e-8ecc-028c1a09d000-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.773581 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.773362 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f9faf4b-08a0-499e-8ecc-028c1a09d000-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.773581 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.773441 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hj2v\" (UniqueName: \"kubernetes.io/projected/6f9faf4b-08a0-499e-8ecc-028c1a09d000-kube-api-access-7hj2v\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.773581 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.773514 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f9faf4b-08a0-499e-8ecc-028c1a09d000-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.773581 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.773550 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f9faf4b-08a0-499e-8ecc-028c1a09d000-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.874707 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.874593 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f9faf4b-08a0-499e-8ecc-028c1a09d000-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.874707 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.874661 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hj2v\" (UniqueName: \"kubernetes.io/projected/6f9faf4b-08a0-499e-8ecc-028c1a09d000-kube-api-access-7hj2v\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.874707 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.874691 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f9faf4b-08a0-499e-8ecc-028c1a09d000-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.874707 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.874710 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f9faf4b-08a0-499e-8ecc-028c1a09d000-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.875045 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.874770 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6f9faf4b-08a0-499e-8ecc-028c1a09d000-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.875045 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.874806 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6f9faf4b-08a0-499e-8ecc-028c1a09d000-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.875142 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.875062 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f9faf4b-08a0-499e-8ecc-028c1a09d000-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.875142 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.875108 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/6f9faf4b-08a0-499e-8ecc-028c1a09d000-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.875206 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.875171 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6f9faf4b-08a0-499e-8ecc-028c1a09d000-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.877171 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.877144 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6f9faf4b-08a0-499e-8ecc-028c1a09d000-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.877294 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.877225 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f9faf4b-08a0-499e-8ecc-028c1a09d000-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.882563 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.882532 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hj2v\" (UniqueName: \"kubernetes.io/projected/6f9faf4b-08a0-499e-8ecc-028c1a09d000-kube-api-access-7hj2v\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x\" (UID: \"6f9faf4b-08a0-499e-8ecc-028c1a09d000\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:02.941767 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:02.941732 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:03.069141 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:03.069106 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x"] Apr 20 15:04:03.072654 ip-10-0-142-255 kubenswrapper[2538]: W0420 15:04:03.072627 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f9faf4b_08a0_499e_8ecc_028c1a09d000.slice/crio-b332032eb50dcd8fc4818af5de89e85547b2d810347f0932dc671efdcc1af2ae WatchSource:0}: Error finding container b332032eb50dcd8fc4818af5de89e85547b2d810347f0932dc671efdcc1af2ae: Status 404 returned error can't find the container with id b332032eb50dcd8fc4818af5de89e85547b2d810347f0932dc671efdcc1af2ae Apr 20 15:04:03.606867 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:03.606810 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-5cbc9fdf6-pwbwn" Apr 20 15:04:03.616544 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:03.616511 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" event={"ID":"6f9faf4b-08a0-499e-8ecc-028c1a09d000","Type":"ContainerStarted","Data":"b332032eb50dcd8fc4818af5de89e85547b2d810347f0932dc671efdcc1af2ae"} Apr 20 15:04:08.636726 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:08.636638 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" event={"ID":"6f9faf4b-08a0-499e-8ecc-028c1a09d000","Type":"ContainerStarted","Data":"784e7f78ac40221e5314be2ac730ccbdabb2c03d376c4b211a5f16d6d994c09a"} Apr 20 15:04:14.657421 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:14.657379 2538 generic.go:358] "Generic (PLEG): container finished" podID="6f9faf4b-08a0-499e-8ecc-028c1a09d000" containerID="784e7f78ac40221e5314be2ac730ccbdabb2c03d376c4b211a5f16d6d994c09a" exitCode=0 Apr 20 15:04:14.657794 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:14.657456 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" event={"ID":"6f9faf4b-08a0-499e-8ecc-028c1a09d000","Type":"ContainerDied","Data":"784e7f78ac40221e5314be2ac730ccbdabb2c03d376c4b211a5f16d6d994c09a"} Apr 20 15:04:16.666356 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:16.666317 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" event={"ID":"6f9faf4b-08a0-499e-8ecc-028c1a09d000","Type":"ContainerStarted","Data":"b4a8f4dd1b5700e762fccc5fd112e60c981013f6c7c2fbe73f917b9da0f9d0a0"} Apr 20 15:04:16.666734 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:16.666554 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:16.685299 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:16.685248 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" podStartSLOduration=1.953711502 podStartE2EDuration="14.685231304s" podCreationTimestamp="2026-04-20 15:04:02 +0000 UTC" firstStartedPulling="2026-04-20 15:04:03.074269605 +0000 UTC m=+632.892895126" lastFinishedPulling="2026-04-20 15:04:15.805789395 +0000 UTC m=+645.624414928" observedRunningTime="2026-04-20 15:04:16.683907429 +0000 UTC m=+646.502533005" watchObservedRunningTime="2026-04-20 15:04:16.685231304 +0000 
UTC m=+646.503856874" Apr 20 15:04:27.682777 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:27.682738 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x" Apr 20 15:04:46.725460 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.725427 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67"] Apr 20 15:04:46.765418 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.765385 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67"] Apr 20 15:04:46.765568 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.765497 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.768163 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.768139 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 20 15:04:46.862157 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.862115 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0051bcef-b662-42e5-b613-3e0f25143bf4-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.862157 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.862162 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0051bcef-b662-42e5-b613-3e0f25143bf4-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.862427 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.862218 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0051bcef-b662-42e5-b613-3e0f25143bf4-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.862427 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.862296 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gksq4\" (UniqueName: \"kubernetes.io/projected/0051bcef-b662-42e5-b613-3e0f25143bf4-kube-api-access-gksq4\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.862427 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.862329 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0051bcef-b662-42e5-b613-3e0f25143bf4-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.862427 ip-10-0-142-255 kubenswrapper[2538]: I0420 
15:04:46.862382 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0051bcef-b662-42e5-b613-3e0f25143bf4-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.963495 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.963455 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gksq4\" (UniqueName: \"kubernetes.io/projected/0051bcef-b662-42e5-b613-3e0f25143bf4-kube-api-access-gksq4\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.963495 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.963498 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0051bcef-b662-42e5-b613-3e0f25143bf4-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.963749 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.963531 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0051bcef-b662-42e5-b613-3e0f25143bf4-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.963749 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.963565 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0051bcef-b662-42e5-b613-3e0f25143bf4-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.963749 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.963589 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0051bcef-b662-42e5-b613-3e0f25143bf4-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.963749 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.963632 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0051bcef-b662-42e5-b613-3e0f25143bf4-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.964014 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.963988 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0051bcef-b662-42e5-b613-3e0f25143bf4-kserve-provision-location\") pod 
\"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.964014 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.964004 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0051bcef-b662-42e5-b613-3e0f25143bf4-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.964135 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.964071 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0051bcef-b662-42e5-b613-3e0f25143bf4-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.965814 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.965792 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0051bcef-b662-42e5-b613-3e0f25143bf4-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.966118 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.966098 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0051bcef-b662-42e5-b613-3e0f25143bf4-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:46.970817 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:46.970795 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gksq4\" (UniqueName: \"kubernetes.io/projected/0051bcef-b662-42e5-b613-3e0f25143bf4-kube-api-access-gksq4\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-82z67\" (UID: \"0051bcef-b662-42e5-b613-3e0f25143bf4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:47.074527 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:47.074476 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:47.198913 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:47.198888 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67"] Apr 20 15:04:47.201614 ip-10-0-142-255 kubenswrapper[2538]: W0420 15:04:47.201586 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0051bcef_b662_42e5_b613_3e0f25143bf4.slice/crio-142ab745f410e86c0a6755692be4df86910c4fa574e25fd49ca37d30cc937cc0 WatchSource:0}: Error finding container 142ab745f410e86c0a6755692be4df86910c4fa574e25fd49ca37d30cc937cc0: Status 404 returned error can't find the container with id 142ab745f410e86c0a6755692be4df86910c4fa574e25fd49ca37d30cc937cc0 Apr 20 15:04:47.203472 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:47.203454 2538 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:04:47.769044 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:47.769002 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" event={"ID":"0051bcef-b662-42e5-b613-3e0f25143bf4","Type":"ContainerStarted","Data":"0065213be09196f848ee54ee9596cae591ad28cad34abd3996fd1c54242a2f23"} Apr 20 15:04:47.769445 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:47.769052 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" event={"ID":"0051bcef-b662-42e5-b613-3e0f25143bf4","Type":"ContainerStarted","Data":"142ab745f410e86c0a6755692be4df86910c4fa574e25fd49ca37d30cc937cc0"} Apr 20 15:04:56.807458 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:56.807418 2538 generic.go:358] "Generic (PLEG): container finished" podID="0051bcef-b662-42e5-b613-3e0f25143bf4" containerID="0065213be09196f848ee54ee9596cae591ad28cad34abd3996fd1c54242a2f23" exitCode=0 Apr 20 15:04:56.807885 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:56.807488 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" event={"ID":"0051bcef-b662-42e5-b613-3e0f25143bf4","Type":"ContainerDied","Data":"0065213be09196f848ee54ee9596cae591ad28cad34abd3996fd1c54242a2f23"} Apr 20 15:04:57.812928 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:57.812891 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" event={"ID":"0051bcef-b662-42e5-b613-3e0f25143bf4","Type":"ContainerStarted","Data":"4ff6431e46470083c829b237fab56638ea7e74cb65b6885049cf4bab3ec25edc"} Apr 20 15:04:57.813465 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:57.813097 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:04:57.829745 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:04:57.829699 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" podStartSLOduration=11.446306185 podStartE2EDuration="11.82968583s" podCreationTimestamp="2026-04-20 15:04:46 +0000 UTC" firstStartedPulling="2026-04-20 15:04:56.808164771 +0000 UTC m=+686.626790292" lastFinishedPulling="2026-04-20 15:04:57.191544408 +0000 UTC m=+687.010169937" observedRunningTime="2026-04-20 15:04:57.828721371 +0000 UTC 
m=+687.647346914" watchObservedRunningTime="2026-04-20 15:04:57.82968583 +0000 UTC m=+687.648311372" Apr 20 15:05:08.830251 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:05:08.830219 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-82z67" Apr 20 15:27:52.990233 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:27:52.990125 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-8jv28_c82f1c0e-3b58-49a9-90bd-72e4090803d6/manager/0.log" Apr 20 15:27:53.123264 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:27:53.123230 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-5cbc9fdf6-pwbwn_49533dbe-947d-4a49-959b-62f418f4ec13/maas-api/0.log" Apr 20 15:27:53.388775 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:27:53.388739 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-bbdhj_f84924d4-96fb-4afc-bde7-04da5bb85ad9/manager/2.log" Apr 20 15:27:53.643843 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:27:53.643761 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-854569cf8c-zflg5_4997ee7e-4a5c-445a-b713-94077b5e7f2d/manager/0.log" Apr 20 15:27:55.523110 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:27:55.523084 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-gcn5c_24f747ee-27b3-4268-81d2-0646d4176133/manager/0.log" Apr 20 15:27:56.265169 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:27:56.265137 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-rmvcw_972c97bb-ea8a-41d7-9baf-7a6e83378abf/manager/0.log" Apr 20 15:27:56.775933 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:27:56.775899 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-ht8z8_85351691-c4e3-449f-b83c-c56025af38f9/discovery/0.log" Apr 20 15:27:57.014058 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:27:57.014029 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-597dfdc786-zp9qn_29878348-f049-4527-b442-17692aa7a4f3/kube-auth-proxy/0.log" Apr 20 15:27:57.701688 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:27:57.701651 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x_6f9faf4b-08a0-499e-8ecc-028c1a09d000/storage-initializer/0.log" Apr 20 15:27:57.709750 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:27:57.709718 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-lxs8x_6f9faf4b-08a0-499e-8ecc-028c1a09d000/main/0.log" Apr 20 15:27:58.389865 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:27:58.389833 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-82z67_0051bcef-b662-42e5-b613-3e0f25143bf4/storage-initializer/0.log" Apr 20 15:27:58.398542 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:27:58.398516 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-82z67_0051bcef-b662-42e5-b613-3e0f25143bf4/main/0.log" Apr 20 15:28:05.513787 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:05.513729 2538 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-t7x4d_4a876a88-f48c-4f22-83fa-9e878cf5029d/global-pull-secret-syncer/0.log" Apr 20 15:28:05.566972 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:05.566939 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-777tg_fa1d6f1c-e0bc-4dae-beb6-31f8ba86f47d/konnectivity-agent/0.log" Apr 20 15:28:05.694571 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:05.694534 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-255.ec2.internal_e688fec9147a531ae0f3ba981a4ec304/haproxy/0.log" Apr 20 15:28:09.746621 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:09.746588 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-gcn5c_24f747ee-27b3-4268-81d2-0646d4176133/manager/0.log" Apr 20 15:28:10.103868 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:10.103830 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-rmvcw_972c97bb-ea8a-41d7-9baf-7a6e83378abf/manager/0.log" Apr 20 15:28:12.089871 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:12.089843 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtbzh_df1a19f5-b9af-4b67-a68b-7d07365aeefa/node-exporter/0.log" Apr 20 15:28:12.117752 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:12.117723 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtbzh_df1a19f5-b9af-4b67-a68b-7d07365aeefa/kube-rbac-proxy/0.log" Apr 20 15:28:12.150152 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:12.150130 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtbzh_df1a19f5-b9af-4b67-a68b-7d07365aeefa/init-textfile/0.log" Apr 20 15:28:12.553893 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:12.553792 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-667b67845d-t24gr_34da4e27-67e3-476b-9fe2-5389ceef268e/telemeter-client/0.log" Apr 20 15:28:12.577521 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:12.577496 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-667b67845d-t24gr_34da4e27-67e3-476b-9fe2-5389ceef268e/reload/0.log" Apr 20 15:28:12.608336 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:12.608309 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-667b67845d-t24gr_34da4e27-67e3-476b-9fe2-5389ceef268e/kube-rbac-proxy/0.log" Apr 20 15:28:14.594863 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.594829 2538 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2"] Apr 20 15:28:14.598219 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.598195 2538 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.600733 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.600704 2538 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jrbql\"/\"default-dockercfg-whr8p\"" Apr 20 15:28:14.600868 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.600740 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrbql\"/\"openshift-service-ca.crt\"" Apr 20 15:28:14.601954 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.601936 2538 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrbql\"/\"kube-root-ca.crt\"" Apr 20 15:28:14.606807 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.606786 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2"] Apr 20 15:28:14.687553 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.687514 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2b774ff9-a9f6-4e34-8e96-4ad20732640a-podres\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.687553 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.687557 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn5n4\" (UniqueName: \"kubernetes.io/projected/2b774ff9-a9f6-4e34-8e96-4ad20732640a-kube-api-access-jn5n4\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.687768 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.687616 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2b774ff9-a9f6-4e34-8e96-4ad20732640a-proc\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.687768 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.687714 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b774ff9-a9f6-4e34-8e96-4ad20732640a-sys\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.687768 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.687757 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2b774ff9-a9f6-4e34-8e96-4ad20732640a-lib-modules\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.789020 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.788984 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b774ff9-a9f6-4e34-8e96-4ad20732640a-sys\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " 
pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.789232 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.789030 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2b774ff9-a9f6-4e34-8e96-4ad20732640a-lib-modules\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.789232 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.789058 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2b774ff9-a9f6-4e34-8e96-4ad20732640a-podres\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.789232 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.789075 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jn5n4\" (UniqueName: \"kubernetes.io/projected/2b774ff9-a9f6-4e34-8e96-4ad20732640a-kube-api-access-jn5n4\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.789232 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.789101 2538 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2b774ff9-a9f6-4e34-8e96-4ad20732640a-proc\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.789232 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.789105 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b774ff9-a9f6-4e34-8e96-4ad20732640a-sys\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.789232 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.789182 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2b774ff9-a9f6-4e34-8e96-4ad20732640a-proc\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.789232 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.789198 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2b774ff9-a9f6-4e34-8e96-4ad20732640a-lib-modules\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.789545 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.789237 2538 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2b774ff9-a9f6-4e34-8e96-4ad20732640a-podres\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.797867 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.797843 2538 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jn5n4\" (UniqueName: \"kubernetes.io/projected/2b774ff9-a9f6-4e34-8e96-4ad20732640a-kube-api-access-jn5n4\") pod \"perf-node-gather-daemonset-4zvw2\" (UID: \"2b774ff9-a9f6-4e34-8e96-4ad20732640a\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:14.909274 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:14.909164 2538 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:15.033082 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:15.033052 2538 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2"] Apr 20 15:28:15.035791 ip-10-0-142-255 kubenswrapper[2538]: W0420 15:28:15.035753 2538 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2b774ff9_a9f6_4e34_8e96_4ad20732640a.slice/crio-be20f715cdbd8d513de0c4af72ea3a3ade13de8540e01647a50005c53834c142 WatchSource:0}: Error finding container be20f715cdbd8d513de0c4af72ea3a3ade13de8540e01647a50005c53834c142: Status 404 returned error can't find the container with id be20f715cdbd8d513de0c4af72ea3a3ade13de8540e01647a50005c53834c142 Apr 20 15:28:15.037522 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:15.037503 2538 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:28:15.474947 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:15.474904 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" event={"ID":"2b774ff9-a9f6-4e34-8e96-4ad20732640a","Type":"ContainerStarted","Data":"7f5903a13c7e605a7e0df69acacc56e678579713ce34878914009bbdf26c77a0"} Apr 20 15:28:15.474947 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:15.474945 2538 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" event={"ID":"2b774ff9-a9f6-4e34-8e96-4ad20732640a","Type":"ContainerStarted","Data":"be20f715cdbd8d513de0c4af72ea3a3ade13de8540e01647a50005c53834c142"} Apr 20 15:28:15.475192 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:15.475012 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:15.491944 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:15.491896 2538 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" podStartSLOduration=1.491879697 podStartE2EDuration="1.491879697s" podCreationTimestamp="2026-04-20 15:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:28:15.489998842 +0000 UTC m=+2085.308624384" watchObservedRunningTime="2026-04-20 15:28:15.491879697 +0000 UTC m=+2085.310505239" Apr 20 15:28:16.275812 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:16.275785 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v5pr2_3dfaac5e-6a5e-40e4-81fa-19962c0d578f/dns/0.log" Apr 20 15:28:16.299518 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:16.299481 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v5pr2_3dfaac5e-6a5e-40e4-81fa-19962c0d578f/kube-rbac-proxy/0.log" Apr 20 15:28:16.350731 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:16.350698 2538 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-64772_2d674b0f-26a1-44f7-8346-4ad4d666371e/dns-node-resolver/0.log" Apr 20 15:28:16.893083 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:16.893042 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xt2lb_1252f9bc-f9c8-4a62-8bbf-b5e145f0e656/node-ca/0.log" Apr 20 15:28:17.878290 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:17.878256 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-ht8z8_85351691-c4e3-449f-b83c-c56025af38f9/discovery/0.log" Apr 20 15:28:17.920204 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:17.920174 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-597dfdc786-zp9qn_29878348-f049-4527-b442-17692aa7a4f3/kube-auth-proxy/0.log" Apr 20 15:28:18.556744 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:18.556707 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kw4l7_50a05c11-ab26-4873-920e-803fdbe14912/serve-healthcheck-canary/0.log" Apr 20 15:28:19.022602 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:19.022570 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qdbm4_2b12aef5-3be9-4fe2-909b-17d214356c6c/kube-rbac-proxy/0.log" Apr 20 15:28:19.044925 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:19.044896 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qdbm4_2b12aef5-3be9-4fe2-909b-17d214356c6c/exporter/0.log" Apr 20 15:28:19.079561 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:19.079524 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qdbm4_2b12aef5-3be9-4fe2-909b-17d214356c6c/extractor/0.log" Apr 20 15:28:21.176026 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:21.175973 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-8jv28_c82f1c0e-3b58-49a9-90bd-72e4090803d6/manager/0.log" Apr 20 15:28:21.221224 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:21.221192 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-5cbc9fdf6-pwbwn_49533dbe-947d-4a49-959b-62f418f4ec13/maas-api/0.log" Apr 20 15:28:21.302552 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:21.302524 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-bbdhj_f84924d4-96fb-4afc-bde7-04da5bb85ad9/manager/1.log" Apr 20 15:28:21.322058 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:21.322027 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-bbdhj_f84924d4-96fb-4afc-bde7-04da5bb85ad9/manager/2.log" Apr 20 15:28:21.410813 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:21.410769 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-854569cf8c-zflg5_4997ee7e-4a5c-445a-b713-94077b5e7f2d/manager/0.log" Apr 20 15:28:21.489152 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:21.489072 2538 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-4zvw2" Apr 20 15:28:22.643037 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:22.643006 2538 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-sc7tm_7dac5d4d-7074-4cc8-a214-c276e9876766/openshift-lws-operator/0.log" Apr 20 15:28:28.599158 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:28.599128 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fsr4z_2cb834ae-b00c-44c5-8c4b-591c1777bf5f/kube-multus-additional-cni-plugins/0.log" Apr 20 15:28:28.618666 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:28.618636 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fsr4z_2cb834ae-b00c-44c5-8c4b-591c1777bf5f/egress-router-binary-copy/0.log" Apr 20 15:28:28.640851 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:28.640818 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fsr4z_2cb834ae-b00c-44c5-8c4b-591c1777bf5f/cni-plugins/0.log" Apr 20 15:28:28.660099 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:28.660077 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fsr4z_2cb834ae-b00c-44c5-8c4b-591c1777bf5f/bond-cni-plugin/0.log" Apr 20 15:28:28.680109 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:28.680089 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fsr4z_2cb834ae-b00c-44c5-8c4b-591c1777bf5f/routeoverride-cni/0.log" Apr 20 15:28:28.703036 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:28.703014 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fsr4z_2cb834ae-b00c-44c5-8c4b-591c1777bf5f/whereabouts-cni-bincopy/0.log" Apr 20 15:28:28.722323 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:28.722299 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fsr4z_2cb834ae-b00c-44c5-8c4b-591c1777bf5f/whereabouts-cni/0.log" Apr 20 15:28:29.105322 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:29.105286 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mw2xg_54adba6d-382e-43b7-9219-644ce4ea5f46/kube-multus/0.log" Apr 20 15:28:29.126642 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:29.126604 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fdmrj_5752b3ca-4688-4db8-9995-af78bc6f30d3/network-metrics-daemon/0.log" Apr 20 15:28:29.158767 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:29.158739 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fdmrj_5752b3ca-4688-4db8-9995-af78bc6f30d3/kube-rbac-proxy/0.log" Apr 20 15:28:30.330864 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:30.330831 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q22pz_5e16698f-ad67-45e2-8b90-cd0a144a2469/ovn-controller/0.log" Apr 20 15:28:30.364872 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:30.364840 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q22pz_5e16698f-ad67-45e2-8b90-cd0a144a2469/ovn-acl-logging/0.log" Apr 20 15:28:30.401726 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:30.401695 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q22pz_5e16698f-ad67-45e2-8b90-cd0a144a2469/kube-rbac-proxy-node/0.log" Apr 20 15:28:30.443804 ip-10-0-142-255 kubenswrapper[2538]: 
I0420 15:28:30.443782 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q22pz_5e16698f-ad67-45e2-8b90-cd0a144a2469/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 15:28:30.462628 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:30.462606 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q22pz_5e16698f-ad67-45e2-8b90-cd0a144a2469/northd/0.log" Apr 20 15:28:30.483563 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:30.483539 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q22pz_5e16698f-ad67-45e2-8b90-cd0a144a2469/nbdb/0.log" Apr 20 15:28:30.503816 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:30.503786 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q22pz_5e16698f-ad67-45e2-8b90-cd0a144a2469/sbdb/0.log" Apr 20 15:28:30.682035 ip-10-0-142-255 kubenswrapper[2538]: I0420 15:28:30.681949 2538 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q22pz_5e16698f-ad67-45e2-8b90-cd0a144a2469/ovnkube-controller/0.log"