Apr 19 15:21:14.259913 ip-10-0-131-48 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 19 15:21:14.259925 ip-10-0-131-48 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 19 15:21:14.259933 ip-10-0-131-48 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 19 15:21:14.260085 ip-10-0-131-48 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 19 15:21:24.317948 ip-10-0-131-48 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 19 15:21:24.317965 ip-10-0-131-48 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 0b10652a130a4ec48d06f865009a5230 --
Apr 19 15:24:01.092906 ip-10-0-131-48 systemd[1]: Starting Kubernetes Kubelet...
Apr 19 15:24:01.489408 ip-10-0-131-48 kubenswrapper[2564]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 15:24:01.489408 ip-10-0-131-48 kubenswrapper[2564]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 19 15:24:01.489408 ip-10-0-131-48 kubenswrapper[2564]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 15:24:01.489408 ip-10-0-131-48 kubenswrapper[2564]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 19 15:24:01.489408 ip-10-0-131-48 kubenswrapper[2564]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 15:24:01.490417 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.489981    2564 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 19 15:24:01.496011 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.495991    2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 15:24:01.496011 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496009    2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 15:24:01.496011 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496014    2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 15:24:01.496011 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496019    2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496024    2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496028    2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496033    2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496038    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496042    2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496047    2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496051    2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496055    2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496059    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496063    2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496067    2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496071    2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496076    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496081    2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496085    2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496090    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496094    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496099    2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496104    2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 15:24:01.496274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496107    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496111    2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496115    2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496119    2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496126    2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496132    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496137    2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496142    2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496147    2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496152    2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496158    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496162    2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496175    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496180    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496186    2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496190    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496195    2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496199    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496204    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 15:24:01.497096 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496208    2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496212    2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496216    2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496221    2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496225    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496229    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496234    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496238    2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496242    2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496247    2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496251    2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496255    2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496260    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496265    2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496269    2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496273    2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496277    2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496281    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496285    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496290    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 15:24:01.497929 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496294    2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496298    2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496302    2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496306    2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496311    2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496315    2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496319    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496326    2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496331    2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496336    2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496340    2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496344    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496349    2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496353    2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496358    2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496362    2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496367    2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496371    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496377    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496381    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 15:24:01.498802 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496385    2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496392    2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496398    2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.496402    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497077    2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497086    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497091    2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497097    2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497103    2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497108    2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497113    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497118    2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497123    2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497128    2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497132    2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497137    2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497141    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497146    2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497150    2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 15:24:01.499418 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497154    2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497159    2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497163    2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497168    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497172    2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497177    2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497181    2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497187    2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497193    2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497197    2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497201    2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497205    2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497209    2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497213    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497218    2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497222    2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497254    2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497262    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497268    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497273    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 15:24:01.499983 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497277    2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497282    2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497286    2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497290    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497294    2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497298    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497302    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497307    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497311    2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497316    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497321    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497325    2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497329    2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497334    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497338    2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497343    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497347    2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497351    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497355    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 15:24:01.500669 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497359    2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497364    2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497368    2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497372    2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497376    2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497380    2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497385    2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497389    2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497394    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497398    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497403    2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497407    2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497413    2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497417    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497421    2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497425    2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497429    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497433    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497438    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497442    2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 15:24:01.501201 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497446    2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497450    2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497455    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497460    2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497464    2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497469    2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497474    2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497478    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497482    2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497486    2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497490    2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.497495    2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497594    2564 flags.go:64] FLAG: --address="0.0.0.0"
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497605    2564 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497616    2564 flags.go:64] FLAG: --anonymous-auth="true"
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497623    2564 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497646    2564 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497652    2564 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497666    2564 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497673    2564 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497678    2564 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 19 15:24:01.501716 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497684    2564 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497689    2564 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497694    2564 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497699    2564 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497706    2564 flags.go:64] FLAG: --cgroup-root=""
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497710    2564 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497716    2564 flags.go:64] FLAG: --client-ca-file=""
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497721    2564 flags.go:64] FLAG: --cloud-config=""
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497725    2564 flags.go:64] FLAG: --cloud-provider="external"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497730    2564 flags.go:64] FLAG: --cluster-dns="[]"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497737    2564 flags.go:64] FLAG: --cluster-domain=""
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497742    2564 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497747    2564 flags.go:64] FLAG: --config-dir=""
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497752    2564 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497757    2564 flags.go:64] FLAG: --container-log-max-files="5"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497764    2564 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497769    2564 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497775    2564 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497781    2564 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497786    2564 flags.go:64] FLAG: --contention-profiling="false"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497791    2564 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497796    2564 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497801    2564 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497805    2564 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497812    2564 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 19 15:24:01.502322 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497817    2564 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497822    2564 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497827    2564 flags.go:64] FLAG: --enable-load-reader="false"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497835    2564 flags.go:64] FLAG: --enable-server="true"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497840    2564 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497848    2564 flags.go:64] FLAG: --event-burst="100"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497853    2564 flags.go:64] FLAG: --event-qps="50"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497858    2564 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497863    2564 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497869    2564 flags.go:64] FLAG: --eviction-hard=""
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497875    2564 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497880    2564 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497886    2564 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497891    2564 flags.go:64] FLAG: --eviction-soft=""
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497896    2564 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497901    2564 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497907    2564 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497911    2564 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497916    2564 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497922    2564 flags.go:64] FLAG: --fail-swap-on="true"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497926    2564 flags.go:64] FLAG: --feature-gates=""
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497932    2564 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497937    2564 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497942    2564 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 19 15:24:01.502970
ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497947 2564 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497952 2564 flags.go:64] FLAG: --healthz-port="10248" Apr 19 15:24:01.502970 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497957 2564 flags.go:64] FLAG: --help="false" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497962 2564 flags.go:64] FLAG: --hostname-override="ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497968 2564 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497973 2564 flags.go:64] FLAG: --http-check-frequency="20s" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497978 2564 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497983 2564 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497989 2564 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497993 2564 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.497998 2564 flags.go:64] FLAG: --image-service-endpoint="" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498005 2564 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498010 2564 flags.go:64] FLAG: --kube-api-burst="100" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498015 2564 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498020 2564 flags.go:64] FLAG: --kube-api-qps="50" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498025 2564 flags.go:64] FLAG: --kube-reserved="" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498030 2564 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498035 2564 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498039 2564 flags.go:64] FLAG: --kubelet-cgroups="" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498044 2564 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498049 2564 flags.go:64] FLAG: --lock-file="" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498055 2564 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498060 2564 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498065 2564 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498075 2564 flags.go:64] FLAG: --log-json-split-stream="false" Apr 19 15:24:01.503606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498079 2564 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498084 2564 flags.go:64] FLAG: --log-text-split-stream="false" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498089 2564 flags.go:64] FLAG: --logging-format="text" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 
15:24:01.498094 2564 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498100 2564 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498105 2564 flags.go:64] FLAG: --manifest-url="" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498110 2564 flags.go:64] FLAG: --manifest-url-header="" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498116 2564 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498121 2564 flags.go:64] FLAG: --max-open-files="1000000" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498129 2564 flags.go:64] FLAG: --max-pods="110" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498134 2564 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498139 2564 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498143 2564 flags.go:64] FLAG: --memory-manager-policy="None" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498148 2564 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498153 2564 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498158 2564 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498163 2564 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498175 2564 flags.go:64] 
FLAG: --node-status-max-images="50" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498182 2564 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498187 2564 flags.go:64] FLAG: --oom-score-adj="-999" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498192 2564 flags.go:64] FLAG: --pod-cidr="" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498197 2564 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498206 2564 flags.go:64] FLAG: --pod-manifest-path="" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498211 2564 flags.go:64] FLAG: --pod-max-pids="-1" Apr 19 15:24:01.504219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498216 2564 flags.go:64] FLAG: --pods-per-core="0" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498221 2564 flags.go:64] FLAG: --port="10250" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498226 2564 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498231 2564 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-001427d9fc2d71605" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498236 2564 flags.go:64] FLAG: --qos-reserved="" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498242 2564 flags.go:64] FLAG: --read-only-port="10255" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498247 2564 flags.go:64] FLAG: --register-node="true" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498252 2564 flags.go:64] FLAG: --register-schedulable="true" Apr 19 
15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498260 2564 flags.go:64] FLAG: --register-with-taints="" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498266 2564 flags.go:64] FLAG: --registry-burst="10" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498271 2564 flags.go:64] FLAG: --registry-qps="5" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498276 2564 flags.go:64] FLAG: --reserved-cpus="" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498281 2564 flags.go:64] FLAG: --reserved-memory="" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498287 2564 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498293 2564 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498298 2564 flags.go:64] FLAG: --rotate-certificates="false" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498303 2564 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498308 2564 flags.go:64] FLAG: --runonce="false" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498312 2564 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498317 2564 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498322 2564 flags.go:64] FLAG: --seccomp-default="false" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498327 2564 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498332 2564 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" 
Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498338    2564 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498343    2564 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498348    2564 flags.go:64] FLAG: --storage-driver-password="root"
Apr 19 15:24:01.504835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498354    2564 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498359    2564 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498364    2564 flags.go:64] FLAG: --storage-driver-user="root"
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498369    2564 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498374    2564 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498379    2564 flags.go:64] FLAG: --system-cgroups=""
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498384    2564 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498393    2564 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498397    2564 flags.go:64] FLAG: --tls-cert-file=""
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498402    2564 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498408    2564 flags.go:64] FLAG: --tls-min-version=""
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498413    2564 flags.go:64] FLAG: --tls-private-key-file=""
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498418    2564 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498424    2564 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498430    2564 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498435    2564 flags.go:64] FLAG: --v="2"
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498442    2564 flags.go:64] FLAG: --version="false"
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498448    2564 flags.go:64] FLAG: --vmodule=""
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498455    2564 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.498460    2564 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498661    2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498668    2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498673    2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498677    2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 15:24:01.505673 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498682    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498686    2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498692    2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498699    2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498705    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498709    2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498714    2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498719    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498725    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498730    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498734    2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498739    2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498743    2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498747    2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498751    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498756    2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498760    2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498764    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498771    2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 15:24:01.506488 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498777    2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498782    2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498788    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498795    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498799    2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498804    2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498808    2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498813    2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498817    2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498821    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498826    2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498830    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498835    2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498839    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498844    2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498848    2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498852    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498857    2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498861    2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498865    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 15:24:01.507020 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498869    2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498873    2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498878    2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498882    2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498887    2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498890    2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498895    2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498899    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498903    2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498907    2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498911    2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498915    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498920    2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498924    2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498929    2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498936    2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498940    2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498945    2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498949    2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498953    2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 15:24:01.507506 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498957    2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498962    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498966    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498971    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498975    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498979    2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498983    2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498987    2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498991    2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498996    2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.498999    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.499004    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.499008    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.499013    2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.499018    2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.499022    2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.499026    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.499031    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.499035    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 15:24:01.508025 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.499039    2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.499043    2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.499047    2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.499051    2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.499692    2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.507553    2564 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.507570    2564 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507617    2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507622    2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507626    2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507642    2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507646    2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507648    2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507651    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507655    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 15:24:01.508582 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507658    2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507662    2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507667 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507670 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507674 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507676 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507680 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507682 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507685 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507688 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507691 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507694 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507697 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507700 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 19 15:24:01.508986 
ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507702 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507706 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507708 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507711 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 19 15:24:01.508986 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507714 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507717 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507719 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507722 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507725 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507728 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507731 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507734 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507736 2564 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiNetworks Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507739 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507742 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507744 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507747 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507749 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507752 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507755 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507759 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507763 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507766 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 19 15:24:01.509423 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507768 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507771 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507774 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507777 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507780 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507782 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507785 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507788 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507790 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507793 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 
15:24:01.507796 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507799 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507801 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507804 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507807 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507809 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507812 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507815 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507818 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507822 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 19 15:24:01.509901 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507825 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507827 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507830 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507832 2564 
feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507835 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507838 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507840 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507843 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507846 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507848 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507851 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507855 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507857 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507860 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507863 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507865 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 
15:24:01.507868 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507870 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507873 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507875 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 19 15:24:01.510429 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507878 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.507883 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507974 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507980 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507983 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507986 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507989 
2564 feature_gate.go:328] unrecognized feature gate: Example Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507991 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507994 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.507999 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508003 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508007 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508011 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508013 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508017 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 19 15:24:01.510939 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508019 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508022 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508025 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508027 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 19 
15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508031 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508033 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508036 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508039 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508042 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508045 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508048 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508050 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508053 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508056 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508058 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508061 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508063 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 19 
15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508066 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508068 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508071 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 19 15:24:01.511317 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508074 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508076 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508079 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508081 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508084 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508087 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508091 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508094 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508097 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508100 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508103 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508106 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508108 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508111 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508114 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508116 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508119 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508122 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508124 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 19 15:24:01.511828 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508127 
2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508129 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508132 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508136 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508138 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508141 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508143 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508146 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508148 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508151 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508154 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508157 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508159 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 19 15:24:01.512295 
ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508162 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508172 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508175 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508178 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508181 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508183 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508186 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 19 15:24:01.512295 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508189 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508192 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508194 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508197 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508200 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508203 2564 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508205 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508208 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508210 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508213 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508216 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508218 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508221 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:01.508224 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.508228 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 19 15:24:01.512817 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.508943 2564 
server.go:962] "Client rotation is on, will bootstrap in background" Apr 19 15:24:01.513187 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.510995 2564 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 19 15:24:01.513187 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.511869 2564 server.go:1019] "Starting client certificate rotation" Apr 19 15:24:01.513187 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.511968 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 19 15:24:01.513187 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.512001 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 19 15:24:01.535336 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.535316 2564 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 19 15:24:01.537906 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.537885 2564 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 19 15:24:01.551986 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.551964 2564 log.go:25] "Validated CRI v1 runtime API" Apr 19 15:24:01.557342 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.557328 2564 log.go:25] "Validated CRI v1 image API" Apr 19 15:24:01.558551 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.558530 2564 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 19 15:24:01.561164 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.561146 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 19 15:24:01.563772 ip-10-0-131-48 kubenswrapper[2564]: I0419 
15:24:01.563753 2564 fs.go:135] Filesystem UUIDs: map[2d9e2404-03d8-4f11-914a-8777b548211f:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c036191e-2201-4cb2-b915-7b71bfb7da6c:/dev/nvme0n1p3] Apr 19 15:24:01.563837 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.563772 2564 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 19 15:24:01.570285 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.570129 2564 manager.go:217] Machine: {Timestamp:2026-04-19 15:24:01.568462029 +0000 UTC m=+0.368377579 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100736 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec218a8ba6ba6c4800ea9cc6f1d5f68f SystemUUID:ec218a8b-a6ba-6c48-00ea-9cc6f1d5f68f BootID:0b10652a-130a-4ec4-8d06-f865009a5230 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 
HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:76:a5:61:59:01 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:76:a5:61:59:01 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:32:5d:c0:2e:0e:25 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 19 15:24:01.570285 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.570272 2564 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 19 15:24:01.570431 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.570391 2564 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 19 15:24:01.571513 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.571483 2564 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 19 15:24:01.571670 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.571515 2564 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-48.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 19 15:24:01.571713 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.571680 2564 topology_manager.go:138] "Creating topology manager with none policy" Apr 19 15:24:01.571713 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.571688 2564 container_manager_linux.go:306] "Creating device plugin manager" Apr 19 15:24:01.571713 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.571702 2564 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 19 15:24:01.572371 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.572360 2564 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 19 15:24:01.573994 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.573984 2564 state_mem.go:36] "Initialized new in-memory state store" Apr 19 15:24:01.574272 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.574263 2564 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 19 15:24:01.576297 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.576287 2564 kubelet.go:491] "Attempting to sync node with API server" Apr 19 15:24:01.576351 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.576301 2564 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 19 15:24:01.576351 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.576313 2564 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 19 15:24:01.576351 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.576322 2564 kubelet.go:397] "Adding apiserver pod source" Apr 19 15:24:01.576351 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.576330 2564 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 19 15:24:01.577467 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.577454 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 19 15:24:01.577521 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.577474 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 19 15:24:01.580211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.580194 2564 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 19 15:24:01.581535 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.581517 2564 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 19 15:24:01.583182 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.583166 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 19 15:24:01.583267 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.583197 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 19 15:24:01.583267 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.583208 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 19 15:24:01.583267 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.583221 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 19 15:24:01.583267 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.583232 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 19 15:24:01.583267 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.583243 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 19 15:24:01.583267 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.583251 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 19 
15:24:01.583267 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.583261 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 19 15:24:01.583517 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.583271 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 19 15:24:01.583517 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.583280 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 19 15:24:01.583517 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.583293 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 19 15:24:01.583517 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.583306 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 19 15:24:01.584164 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.584145 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 19 15:24:01.584223 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.584168 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 19 15:24:01.586847 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.586825 2564 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-48.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 19 15:24:01.586917 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.586863 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 19 15:24:01.586917 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.586880 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes 
\"ip-10-0-131-48.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 19 15:24:01.587956 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.587941 2564 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 19 15:24:01.588041 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.587981 2564 server.go:1295] "Started kubelet" Apr 19 15:24:01.588166 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.588074 2564 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 19 15:24:01.588223 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.588208 2564 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 19 15:24:01.588703 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.588624 2564 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 19 15:24:01.589013 ip-10-0-131-48 systemd[1]: Started Kubernetes Kubelet. Apr 19 15:24:01.590596 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.590474 2564 server.go:317] "Adding debug handlers to kubelet server" Apr 19 15:24:01.591894 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.591879 2564 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 19 15:24:01.596310 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.596281 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 19 15:24:01.596725 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.596704 2564 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 19 15:24:01.596822 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.596732 2564 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 19 15:24:01.597478 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.597457 2564 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 19 15:24:01.597478 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.597462 2564 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 19 15:24:01.597603 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.597488 2564 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 19 15:24:01.597603 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.597530 2564 factory.go:55] Registering systemd factory Apr 19 15:24:01.597603 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.597551 2564 factory.go:223] Registration of the systemd container factory successfully Apr 19 15:24:01.597603 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.597568 2564 reconstruct.go:97] "Volume reconstruction finished" Apr 19 15:24:01.597603 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.597582 2564 reconciler.go:26] "Reconciler: start to sync state" Apr 19 15:24:01.597819 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.597595 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-48.ec2.internal\" not found" Apr 19 15:24:01.597819 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.597787 2564 factory.go:153] Registering CRI-O factory Apr 19 15:24:01.597819 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.597801 2564 factory.go:223] Registration of the crio container factory successfully Apr 19 15:24:01.597947 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.597891 2564 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: 
cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 19 15:24:01.597947 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.597916 2564 factory.go:103] Registering Raw factory Apr 19 15:24:01.597947 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.597933 2564 manager.go:1196] Started watching for new ooms in manager Apr 19 15:24:01.598313 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.598300 2564 manager.go:319] Starting recovery of all containers Apr 19 15:24:01.603388 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.603356 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 19 15:24:01.603509 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.603487 2564 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-48.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 19 15:24:01.603649 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.603616 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ck9lj" Apr 19 15:24:01.604367 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.603357 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-48.ec2.internal.18a7cb6bbb75e2f5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-48.ec2.internal,UID:ip-10-0-131-48.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-48.ec2.internal,},FirstTimestamp:2026-04-19 15:24:01.587954421 +0000 UTC m=+0.387869978,LastTimestamp:2026-04-19 15:24:01.587954421 +0000 UTC m=+0.387869978,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-48.ec2.internal,}" Apr 19 15:24:01.607069 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.607037 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 19 15:24:01.611391 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.611369 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ck9lj" Apr 19 15:24:01.612541 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.612521 2564 manager.go:324] Recovery completed Apr 19 15:24:01.617332 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.617319 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 15:24:01.619861 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.619845 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasSufficientMemory" Apr 19 15:24:01.619922 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.619876 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 15:24:01.619922 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.619891 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasSufficientPID" Apr 19 15:24:01.620430 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.620414 2564 
cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 19 15:24:01.620430 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.620427 2564 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 19 15:24:01.620529 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.620442 2564 state_mem.go:36] "Initialized new in-memory state store" Apr 19 15:24:01.621784 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.621707 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-48.ec2.internal.18a7cb6bbd5cbf12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-48.ec2.internal,UID:ip-10-0-131-48.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-48.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-48.ec2.internal,},FirstTimestamp:2026-04-19 15:24:01.619861266 +0000 UTC m=+0.419776819,LastTimestamp:2026-04-19 15:24:01.619861266 +0000 UTC m=+0.419776819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-48.ec2.internal,}" Apr 19 15:24:01.623497 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.623482 2564 policy_none.go:49] "None policy: Start" Apr 19 15:24:01.623559 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.623505 2564 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 19 15:24:01.623559 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.623520 2564 state_mem.go:35] "Initializing new in-memory state store" Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.661389 2564 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.661419 2564 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.661442 2564 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.661449 2564 kubelet.go:2451] "Starting kubelet main sync loop" Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.661483 2564 kubelet.go:2475] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.665226 2564 manager.go:341] "Starting Device Plugin manager" Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.665267 2564 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.665280 2564 server.go:85] "Starting device plugin registration server" Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.665315 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.665538 2564 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.665550 2564 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.665657 2564 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 19 15:24:01.675985 
ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.665730 2564 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.665739 2564 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.666244 2564 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 19 15:24:01.675985 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.666279 2564 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-48.ec2.internal\" not found" Apr 19 15:24:01.762623 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.762529 2564 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-48.ec2.internal"] Apr 19 15:24:01.762772 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.762663 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 15:24:01.764756 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.764736 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasSufficientMemory" Apr 19 15:24:01.764866 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.764767 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 15:24:01.764866 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.764781 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasSufficientPID" Apr 19 15:24:01.765823 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.765809 2564 kubelet_node_status.go:413] 
"Setting node annotation to enable volume controller attach/detach" Apr 19 15:24:01.766482 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.766461 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasSufficientMemory" Apr 19 15:24:01.766524 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.766487 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 15:24:01.766524 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.766498 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasSufficientPID" Apr 19 15:24:01.766524 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.766518 2564 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.767106 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.767093 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 15:24:01.767220 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.767207 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.767261 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.767234 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 15:24:01.767775 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.767760 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasSufficientMemory" Apr 19 15:24:01.767775 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.767767 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasSufficientMemory" Apr 19 15:24:01.767883 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.767789 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 15:24:01.767883 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.767806 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasSufficientPID" Apr 19 15:24:01.767883 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.767791 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 15:24:01.767883 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.767872 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasSufficientPID" Apr 19 15:24:01.770142 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.770126 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.770190 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.770159 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 15:24:01.770797 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.770783 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasSufficientMemory" Apr 19 15:24:01.770866 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.770806 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 15:24:01.770866 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.770817 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeHasSufficientPID" Apr 19 15:24:01.772627 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.772611 2564 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.772718 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.772643 2564 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-48.ec2.internal\": node \"ip-10-0-131-48.ec2.internal\" not found" Apr 19 15:24:01.787182 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.787165 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-48.ec2.internal\" not found" node="ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.790083 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.790068 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-48.ec2.internal\" not found" Apr 19 15:24:01.790614 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.790595 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get 
node info from the cluster" err="node \"ip-10-0-131-48.ec2.internal\" not found" node="ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.797913 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.797896 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/576754544584064ebee6a3db50a5f93e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal\" (UID: \"576754544584064ebee6a3db50a5f93e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.798000 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.797925 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/576754544584064ebee6a3db50a5f93e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal\" (UID: \"576754544584064ebee6a3db50a5f93e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.798000 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.797951 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b98e067c43f6f3381b105d98ca711c33-config\") pod \"kube-apiserver-proxy-ip-10-0-131-48.ec2.internal\" (UID: \"b98e067c43f6f3381b105d98ca711c33\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.890693 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.890657 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-48.ec2.internal\" not found" Apr 19 15:24:01.898955 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.898927 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/576754544584064ebee6a3db50a5f93e-var-lib-kubelet\") 
pod \"kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal\" (UID: \"576754544584064ebee6a3db50a5f93e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.899021 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.898961 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b98e067c43f6f3381b105d98ca711c33-config\") pod \"kube-apiserver-proxy-ip-10-0-131-48.ec2.internal\" (UID: \"b98e067c43f6f3381b105d98ca711c33\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.899021 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.898980 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/576754544584064ebee6a3db50a5f93e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal\" (UID: \"576754544584064ebee6a3db50a5f93e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.899089 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.899018 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/576754544584064ebee6a3db50a5f93e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal\" (UID: \"576754544584064ebee6a3db50a5f93e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.899089 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.899033 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/576754544584064ebee6a3db50a5f93e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal\" (UID: \"576754544584064ebee6a3db50a5f93e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal" Apr 19 15:24:01.899089 
ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:01.899037 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b98e067c43f6f3381b105d98ca711c33-config\") pod \"kube-apiserver-proxy-ip-10-0-131-48.ec2.internal\" (UID: \"b98e067c43f6f3381b105d98ca711c33\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-48.ec2.internal"
Apr 19 15:24:01.991140 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:01.991095 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-48.ec2.internal\" not found"
Apr 19 15:24:02.088935 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.088852 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal"
Apr 19 15:24:02.091197 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:02.091171 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-48.ec2.internal\" not found"
Apr 19 15:24:02.093369 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.093347 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-48.ec2.internal"
Apr 19 15:24:02.192217 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:02.192175 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-48.ec2.internal\" not found"
Apr 19 15:24:02.292780 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:02.292747 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-48.ec2.internal\" not found"
Apr 19 15:24:02.393274 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:02.393208 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-48.ec2.internal\" not found"
Apr 19 15:24:02.493761 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:02.493736 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-48.ec2.internal\" not found"
Apr 19 15:24:02.504369 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.504340 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 15:24:02.512138 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.512123 2564 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 19 15:24:02.512315 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.512293 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 19 15:24:02.512420 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.512326 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 19 15:24:02.594520 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:02.594489 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-48.ec2.internal\" not found"
Apr 19 15:24:02.596656 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.596639 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 19 15:24:02.609831 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.609809 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 19 15:24:02.614110 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.614080 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-18 15:19:01 +0000 UTC" deadline="2027-09-23 22:28:22.893986939 +0000 UTC"
Apr 19 15:24:02.614110 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.614104 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12535h4m20.279885222s"
Apr 19 15:24:02.633444 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.633421 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-n4mdl"
Apr 19 15:24:02.641843 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.641825 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-n4mdl"
Apr 19 15:24:02.642619 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.642604 2564 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 15:24:02.697515 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.697483 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal"
Apr 19 15:24:02.712117 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.712098 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 19 15:24:02.713461 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.713449 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-48.ec2.internal"
Apr 19 15:24:02.719475 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.719455 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 19 15:24:02.776031 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:02.776003 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb98e067c43f6f3381b105d98ca711c33.slice/crio-bbadb523115f38844a9a8fe1631597d91cf461fb697a55057f6fd05060a912e6 WatchSource:0}: Error finding container bbadb523115f38844a9a8fe1631597d91cf461fb697a55057f6fd05060a912e6: Status 404 returned error can't find the container with id bbadb523115f38844a9a8fe1631597d91cf461fb697a55057f6fd05060a912e6
Apr 19 15:24:02.776409 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:02.776388 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod576754544584064ebee6a3db50a5f93e.slice/crio-e466831d5e197590147970036b6d5af642ad234cd9ba208629f84690486f3436 WatchSource:0}: Error finding container e466831d5e197590147970036b6d5af642ad234cd9ba208629f84690486f3436: Status 404 returned error can't find the container with id e466831d5e197590147970036b6d5af642ad234cd9ba208629f84690486f3436
Apr 19 15:24:02.781066 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:02.780810 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 19 15:24:03.134481 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.134392 2564 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 15:24:03.446454 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.446221 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 15:24:03.577884 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.577846 2564 apiserver.go:52] "Watching apiserver"
Apr 19 15:24:03.584353 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.584330 2564 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 19 15:24:03.589379 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.589354 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-fsfjs","kube-system/kube-apiserver-proxy-ip-10-0-131-48.ec2.internal","openshift-cluster-node-tuning-operator/tuned-5nvtf","openshift-image-registry/node-ca-d6jcn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal","openshift-multus/multus-additional-cni-plugins-57hmj","openshift-multus/network-metrics-daemon-6gs28","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k","openshift-dns/node-resolver-9wldk","openshift-multus/multus-2jvjq","openshift-network-diagnostics/network-check-target-g9dx5","openshift-network-operator/iptables-alerter-p4lxv","openshift-ovn-kubernetes/ovnkube-node-pwk8b"]
Apr 19 15:24:03.592393 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.592372 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.594557 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.594532 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.595234 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.595207 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 19 15:24:03.595343 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.595269 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 19 15:24:03.595343 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.595307 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 19 15:24:03.595606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.595588 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 19 15:24:03.595696 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.595598 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7lpvg\""
Apr 19 15:24:03.598603 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.598584 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mzwnd\""
Apr 19 15:24:03.598997 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.598955 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28"
Apr 19 15:24:03.598997 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.598969 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 19 15:24:03.599129 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.599014 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 19 15:24:03.599129 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:03.599021 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9"
Apr 19 15:24:03.601834 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.601816 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.603980 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.603960 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-l6tp9\""
Apr 19 15:24:03.604075 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.604025 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fsfjs"
Apr 19 15:24:03.604133 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.604098 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 19 15:24:03.604513 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.604491 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 19 15:24:03.606849 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.606818 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 19 15:24:03.607019 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.606999 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 19 15:24:03.607204 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.607004 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-k824p\""
Apr 19 15:24:03.607555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.607471 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-multus-cni-dir\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.607681 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.607587 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-run-netns\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.607737 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.607674 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-run-multus-certs\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.607790 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.607755 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-sysctl-d\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.608518 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608496 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f4487f66-c637-425c-a304-b53a5a1d6b25-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.608594 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608537 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-multus-socket-dir-parent\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.608594 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608564 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-kubernetes\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.608728 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608611 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-lib-modules\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.608728 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608661 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs\") pod \"network-metrics-daemon-6gs28\" (UID: \"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9\") " pod="openshift-multus/network-metrics-daemon-6gs28"
Apr 19 15:24:03.608728 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608690 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw6cx\" (UniqueName: \"kubernetes.io/projected/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-kube-api-access-lw6cx\") pod \"network-metrics-daemon-6gs28\" (UID: \"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9\") " pod="openshift-multus/network-metrics-daemon-6gs28"
Apr 19 15:24:03.608882 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608731 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-os-release\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.608882 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608767 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-var-lib-cni-multus\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.608882 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608794 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-multus-daemon-config\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.608882 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608838 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-sysctl-conf\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.608882 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608870 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/79f200cd-af39-4394-9bd6-10c3058b2139-etc-tuned\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.609112 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608897 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-system-cni-dir\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.609112 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608922 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-run-k8s-cni-cncf-io\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.609112 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608950 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5"
Apr 19 15:24:03.609112 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608956 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v74dt\" (UniqueName: \"kubernetes.io/projected/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-kube-api-access-v74dt\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.609112 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.608979 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-run\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.609112 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:03.609008 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7"
Apr 19 15:24:03.609112 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609063 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-cnibin\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.609428 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609155 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-p4lxv"
Apr 19 15:24:03.609428 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609180 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-modprobe-d\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.609428 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609205 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-sysconfig\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.609428 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609233 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plj7x\" (UniqueName: \"kubernetes.io/projected/f4487f66-c637-425c-a304-b53a5a1d6b25-kube-api-access-plj7x\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.609428 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609262 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-hostroot\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.609428 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609286 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-multus-conf-dir\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.609428 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609309 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f4487f66-c637-425c-a304-b53a5a1d6b25-os-release\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.609428 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609334 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-var-lib-cni-bin\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.609428 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609357 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-var-lib-kubelet\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.609428 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609379 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-sys\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.609428 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609406 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr7hv\" (UniqueName: \"kubernetes.io/projected/79f200cd-af39-4394-9bd6-10c3058b2139-kube-api-access-kr7hv\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.610012 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609439 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f4487f66-c637-425c-a304-b53a5a1d6b25-system-cni-dir\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.610012 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609469 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f4487f66-c637-425c-a304-b53a5a1d6b25-cni-binary-copy\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.610012 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609508 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f4487f66-c637-425c-a304-b53a5a1d6b25-tuning-conf-dir\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.610012 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609553 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f4487f66-c637-425c-a304-b53a5a1d6b25-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.610012 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609590 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-cni-binary-copy\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.610012 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609617 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-etc-kubernetes\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.610012 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609655 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-systemd\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.610012 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609704 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-var-lib-kubelet\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.610012 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609751 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-host\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.610012 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609785 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/79f200cd-af39-4394-9bd6-10c3058b2139-tmp\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.610012 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.609819 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f4487f66-c637-425c-a304-b53a5a1d6b25-cnibin\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.611666 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.611363 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 19 15:24:03.611666 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.611436 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-grs4d\""
Apr 19 15:24:03.611666 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.611473 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 19 15:24:03.611805 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.611713 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 19 15:24:03.614298 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.613863 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.616232 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.616211 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d6jcn"
Apr 19 15:24:03.616232 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.616234 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 19 15:24:03.616479 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.616289 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.616479 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.616313 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 19 15:24:03.616807 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.616782 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qwwqc\""
Apr 19 15:24:03.616980 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.616965 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 19 15:24:03.617270 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.617170 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 19 15:24:03.617270 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.617219 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 19 15:24:03.617393 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.617290 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 19 15:24:03.618494 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.618478 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9wldk"
Apr 19 15:24:03.618596 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.618566 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 19 15:24:03.618852 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.618832 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lqrwl\""
Apr 19 15:24:03.618941 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.618836 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5w29k\""
Apr 19 15:24:03.618941 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.618835 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 19 15:24:03.618941 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.618838 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 19 15:24:03.619293 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.619224 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 19 15:24:03.619293 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.619276 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 19 15:24:03.619449 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.619278 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 19 15:24:03.620742 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.620725 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 19 15:24:03.620919 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.620900 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-x84qs\""
Apr 19 15:24:03.620987 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.620935 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 19 15:24:03.643338 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.643305 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 15:19:02 +0000 UTC" deadline="2027-12-20 15:50:14.248034327 +0000 UTC"
Apr 19 15:24:03.643433 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.643337 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14640h26m10.604702669s"
Apr 19 15:24:03.670088 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.670041 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal" event={"ID":"576754544584064ebee6a3db50a5f93e","Type":"ContainerStarted","Data":"e466831d5e197590147970036b6d5af642ad234cd9ba208629f84690486f3436"}
Apr 19 15:24:03.671291 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.671262 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-48.ec2.internal" event={"ID":"b98e067c43f6f3381b105d98ca711c33","Type":"ContainerStarted","Data":"bbadb523115f38844a9a8fe1631597d91cf461fb697a55057f6fd05060a912e6"}
Apr 19 15:24:03.699253 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.699195 2564 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 19 15:24:03.710296 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710269
2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-etc-openvswitch\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.710416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710306 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-run-openvswitch\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.710416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710340 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-multus-socket-dir-parent\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.710416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710392 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-lib-modules\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.710569 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710442 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-multus-socket-dir-parent\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.710569 ip-10-0-131-48 
kubenswrapper[2564]: I0419 15:24:03.710497 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw6cx\" (UniqueName: \"kubernetes.io/projected/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-kube-api-access-lw6cx\") pod \"network-metrics-daemon-6gs28\" (UID: \"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9\") " pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:03.710686 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710576 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-lib-modules\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.710686 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710611 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a6381004-4b02-4afb-9f2a-99502a796e5a-host-slash\") pod \"iptables-alerter-p4lxv\" (UID: \"a6381004-4b02-4afb-9f2a-99502a796e5a\") " pod="openshift-network-operator/iptables-alerter-p4lxv" Apr 19 15:24:03.710686 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710672 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-run-netns\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.710826 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710693 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-node-log\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.710826 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710707 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-cni-bin\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.710826 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710740 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/79f200cd-af39-4394-9bd6-10c3058b2139-etc-tuned\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.710826 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710768 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-sys-fs\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k" Apr 19 15:24:03.710826 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710794 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb6vk\" (UniqueName: \"kubernetes.io/projected/e251a3a4-2359-4dcf-90f8-20b9a43b9aa4-kube-api-access-lb6vk\") pod \"node-resolver-9wldk\" (UID: \"e251a3a4-2359-4dcf-90f8-20b9a43b9aa4\") " pod="openshift-dns/node-resolver-9wldk" Apr 19 15:24:03.710826 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710822 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-systemd-units\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.711107 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710848 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-system-cni-dir\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.711107 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710873 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v74dt\" (UniqueName: \"kubernetes.io/projected/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-kube-api-access-v74dt\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.711107 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710898 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-run\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.711107 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710951 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/848efa24-3ed5-46b7-b923-74011caa024a-agent-certs\") pod \"konnectivity-agent-fsfjs\" (UID: \"848efa24-3ed5-46b7-b923-74011caa024a\") " pod="kube-system/konnectivity-agent-fsfjs" Apr 19 15:24:03.711107 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710965 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-run\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.711107 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.710965 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-system-cni-dir\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.711107 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711006 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a6381004-4b02-4afb-9f2a-99502a796e5a-iptables-alerter-script\") pod \"iptables-alerter-p4lxv\" (UID: \"a6381004-4b02-4afb-9f2a-99502a796e5a\") " pod="openshift-network-operator/iptables-alerter-p4lxv" Apr 19 15:24:03.711107 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711041 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-log-socket\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711114 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-cni-netd\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711138 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-ovnkube-script-lib\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711154 2564 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711183 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-modprobe-d\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711209 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-sysconfig\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711234 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plj7x\" (UniqueName: \"kubernetes.io/projected/f4487f66-c637-425c-a304-b53a5a1d6b25-kube-api-access-plj7x\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711260 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e1bc6add-6e54-4e69-ae6d-573e33e1dc7a-serviceca\") pod \"node-ca-d6jcn\" (UID: \"e1bc6add-6e54-4e69-ae6d-573e33e1dc7a\") " pod="openshift-image-registry/node-ca-d6jcn" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711282 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-kubelet\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711304 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-run-systemd\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711328 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-modprobe-d\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711340 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-multus-conf-dir\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711364 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-slash\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711390 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-env-overrides\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711392 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-sysconfig\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711408 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-var-lib-kubelet\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.711451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711438 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kr7hv\" (UniqueName: \"kubernetes.io/projected/79f200cd-af39-4394-9bd6-10c3058b2139-kube-api-access-kr7hv\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.712211 ip-10-0-131-48 
kubenswrapper[2564]: I0419 15:24:03.711459 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-multus-conf-dir\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711467 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f4487f66-c637-425c-a304-b53a5a1d6b25-tuning-conf-dir\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711498 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f4487f66-c637-425c-a304-b53a5a1d6b25-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711547 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-registration-dir\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711588 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e251a3a4-2359-4dcf-90f8-20b9a43b9aa4-hosts-file\") pod \"node-resolver-9wldk\" (UID: 
\"e251a3a4-2359-4dcf-90f8-20b9a43b9aa4\") " pod="openshift-dns/node-resolver-9wldk" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711617 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-cni-binary-copy\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711660 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-etc-kubernetes\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711686 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-host\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711747 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-etc-kubernetes\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711777 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/79f200cd-af39-4394-9bd6-10c3058b2139-tmp\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 
15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711802 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f4487f66-c637-425c-a304-b53a5a1d6b25-cnibin\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711829 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grw8s\" (UniqueName: \"kubernetes.io/projected/80d5a93a-4613-43be-aa6a-427108e9e36e-kube-api-access-grw8s\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711837 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-var-lib-kubelet\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711858 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-run-ovn\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711893 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-host\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " 
pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711934 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-sysctl-d\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.712211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.711973 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-ovnkube-config\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712003 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cjtd\" (UniqueName: \"kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd\") pod \"network-check-target-g9dx5\" (UID: \"07623157-4c19-4793-880d-b21867ce44f7\") " pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712071 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-kubernetes\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712108 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-os-release\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712140 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f4487f66-c637-425c-a304-b53a5a1d6b25-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj" Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712211 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-var-lib-cni-multus\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712155 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-var-lib-cni-multus\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712247 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-sysctl-d\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712269 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-kubernetes\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712279 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f4487f66-c637-425c-a304-b53a5a1d6b25-cnibin\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj" Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712303 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-multus-daemon-config\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712309 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-os-release\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712432 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f4487f66-c637-425c-a304-b53a5a1d6b25-tuning-conf-dir\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj" Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712438 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-cni-binary-copy\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712490 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-sysctl-conf\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712531 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-socket-dir\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712556 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-var-lib-openvswitch\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.713009 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712603 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-run-k8s-cni-cncf-io\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712673 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58g4c\" (UniqueName: \"kubernetes.io/projected/a6381004-4b02-4afb-9f2a-99502a796e5a-kube-api-access-58g4c\") pod \"iptables-alerter-p4lxv\" (UID: \"a6381004-4b02-4afb-9f2a-99502a796e5a\") " pod="openshift-network-operator/iptables-alerter-p4lxv"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712687 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-run-k8s-cni-cncf-io\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712701 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-sysctl-conf\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712737 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-multus-daemon-config\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712736 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712789 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-cnibin\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712819 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712858 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-cnibin\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712904 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-run-ovn-kubernetes\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712932 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-ovn-node-metrics-cert\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712963 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-hostroot\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.712989 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f4487f66-c637-425c-a304-b53a5a1d6b25-os-release\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713015 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zws8\" (UniqueName: \"kubernetes.io/projected/e1bc6add-6e54-4e69-ae6d-573e33e1dc7a-kube-api-access-6zws8\") pod \"node-ca-d6jcn\" (UID: \"e1bc6add-6e54-4e69-ae6d-573e33e1dc7a\") " pod="openshift-image-registry/node-ca-d6jcn"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713028 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-hostroot\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713058 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-device-dir\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713083 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/848efa24-3ed5-46b7-b923-74011caa024a-konnectivity-ca\") pod \"konnectivity-agent-fsfjs\" (UID: \"848efa24-3ed5-46b7-b923-74011caa024a\") " pod="kube-system/konnectivity-agent-fsfjs"
Apr 19 15:24:03.713780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713106 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xj27\" (UniqueName: \"kubernetes.io/projected/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-kube-api-access-2xj27\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713105 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f4487f66-c637-425c-a304-b53a5a1d6b25-os-release\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713140 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-var-lib-cni-bin\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713171 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-sys\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713198 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f4487f66-c637-425c-a304-b53a5a1d6b25-system-cni-dir\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713201 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-var-lib-cni-bin\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713237 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f4487f66-c637-425c-a304-b53a5a1d6b25-cni-binary-copy\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713240 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-sys\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713279 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f4487f66-c637-425c-a304-b53a5a1d6b25-system-cni-dir\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713314 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs\") pod \"network-metrics-daemon-6gs28\" (UID: \"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9\") " pod="openshift-multus/network-metrics-daemon-6gs28"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713346 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e251a3a4-2359-4dcf-90f8-20b9a43b9aa4-tmp-dir\") pod \"node-resolver-9wldk\" (UID: \"e251a3a4-2359-4dcf-90f8-20b9a43b9aa4\") " pod="openshift-dns/node-resolver-9wldk"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713422 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-systemd\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:03.713430 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:03.713512 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs podName:4bc5acd5-37f1-4ca4-bb28-39e2f95194f9 nodeName:}" failed.
No retries permitted until 2026-04-19 15:24:04.213480401 +0000 UTC m=+3.013395955 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs") pod "network-metrics-daemon-6gs28" (UID: "4bc5acd5-37f1-4ca4-bb28-39e2f95194f9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713522 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-etc-systemd\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713612 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-var-lib-kubelet\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713675 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f4487f66-c637-425c-a304-b53a5a1d6b25-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.714555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713693 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79f200cd-af39-4394-9bd6-10c3058b2139-var-lib-kubelet\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.715326 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713715 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1bc6add-6e54-4e69-ae6d-573e33e1dc7a-host\") pod \"node-ca-d6jcn\" (UID: \"e1bc6add-6e54-4e69-ae6d-573e33e1dc7a\") " pod="openshift-image-registry/node-ca-d6jcn"
Apr 19 15:24:03.715326 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713743 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-etc-selinux\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.715326 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713769 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-multus-cni-dir\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.715326 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713795 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-run-netns\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.715326 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713811 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-run-multus-certs\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.715326 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713882 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-run-multus-certs\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.715326 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713883 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-multus-cni-dir\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.715326 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713881 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-host-run-netns\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.715326 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.713903 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f4487f66-c637-425c-a304-b53a5a1d6b25-cni-binary-copy\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.715326 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.714140 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f4487f66-c637-425c-a304-b53a5a1d6b25-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.715806 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.715583 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/79f200cd-af39-4394-9bd6-10c3058b2139-tmp\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.715806 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.715670 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/79f200cd-af39-4394-9bd6-10c3058b2139-etc-tuned\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.732535 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.732506 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw6cx\" (UniqueName: \"kubernetes.io/projected/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-kube-api-access-lw6cx\") pod \"network-metrics-daemon-6gs28\" (UID: \"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9\") " pod="openshift-multus/network-metrics-daemon-6gs28"
Apr 19 15:24:03.732622 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.732509 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v74dt\" (UniqueName: \"kubernetes.io/projected/a265fc76-7ac3-4ce6-b6a9-f1f988b751d5-kube-api-access-v74dt\") pod \"multus-2jvjq\" (UID: \"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5\") " pod="openshift-multus/multus-2jvjq"
Apr 19 15:24:03.733162 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.733141 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr7hv\" (UniqueName: \"kubernetes.io/projected/79f200cd-af39-4394-9bd6-10c3058b2139-kube-api-access-kr7hv\") pod \"tuned-5nvtf\" (UID: \"79f200cd-af39-4394-9bd6-10c3058b2139\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvtf"
Apr 19 15:24:03.733253 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.733236 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plj7x\" (UniqueName: \"kubernetes.io/projected/f4487f66-c637-425c-a304-b53a5a1d6b25-kube-api-access-plj7x\") pod \"multus-additional-cni-plugins-57hmj\" (UID: \"f4487f66-c637-425c-a304-b53a5a1d6b25\") " pod="openshift-multus/multus-additional-cni-plugins-57hmj"
Apr 19 15:24:03.814927 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.814897 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-ovnkube-config\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.815064 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.814936 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cjtd\" (UniqueName: \"kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd\") pod \"network-check-target-g9dx5\" (UID: \"07623157-4c19-4793-880d-b21867ce44f7\") " pod="openshift-network-diagnostics/network-check-target-g9dx5"
Apr 19 15:24:03.815064 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.814963 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-socket-dir\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.815064 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.814987 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-var-lib-openvswitch\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.815064 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815011 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58g4c\" (UniqueName: \"kubernetes.io/projected/a6381004-4b02-4afb-9f2a-99502a796e5a-kube-api-access-58g4c\") pod \"iptables-alerter-p4lxv\" (UID: \"a6381004-4b02-4afb-9f2a-99502a796e5a\") " pod="openshift-network-operator/iptables-alerter-p4lxv"
Apr 19 15:24:03.815064 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815026 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.815064 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815048 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.815416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815072 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-run-ovn-kubernetes\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.815416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815096 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-ovn-node-metrics-cert\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.815416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815095 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-var-lib-openvswitch\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.815416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815122 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zws8\" (UniqueName: \"kubernetes.io/projected/e1bc6add-6e54-4e69-ae6d-573e33e1dc7a-kube-api-access-6zws8\") pod \"node-ca-d6jcn\" (UID: \"e1bc6add-6e54-4e69-ae6d-573e33e1dc7a\") " pod="openshift-image-registry/node-ca-d6jcn"
Apr 19 15:24:03.815416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815132 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-socket-dir\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.815416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815116 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") "
pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.815416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815146 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-device-dir\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.815416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815202 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/848efa24-3ed5-46b7-b923-74011caa024a-konnectivity-ca\") pod \"konnectivity-agent-fsfjs\" (UID: \"848efa24-3ed5-46b7-b923-74011caa024a\") " pod="kube-system/konnectivity-agent-fsfjs"
Apr 19 15:24:03.815416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815207 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.815416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815310 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-device-dir\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.815416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815314 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-run-ovn-kubernetes\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.815416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815323 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xj27\" (UniqueName: \"kubernetes.io/projected/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-kube-api-access-2xj27\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.815416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815380 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e251a3a4-2359-4dcf-90f8-20b9a43b9aa4-tmp-dir\") pod \"node-resolver-9wldk\" (UID: \"e251a3a4-2359-4dcf-90f8-20b9a43b9aa4\") " pod="openshift-dns/node-resolver-9wldk"
Apr 19 15:24:03.815416 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815416 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1bc6add-6e54-4e69-ae6d-573e33e1dc7a-host\") pod \"node-ca-d6jcn\" (UID: \"e1bc6add-6e54-4e69-ae6d-573e33e1dc7a\") " pod="openshift-image-registry/node-ca-d6jcn"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815477 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-etc-selinux\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815530 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1bc6add-6e54-4e69-ae6d-573e33e1dc7a-host\") pod \"node-ca-d6jcn\" (UID: \"e1bc6add-6e54-4e69-ae6d-573e33e1dc7a\") " pod="openshift-image-registry/node-ca-d6jcn"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815548 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-etc-openvswitch\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815573 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-etc-selinux\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815590 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-run-openvswitch\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815653 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-etc-openvswitch\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815617 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a6381004-4b02-4afb-9f2a-99502a796e5a-host-slash\") pod \"iptables-alerter-p4lxv\" (UID: \"a6381004-4b02-4afb-9f2a-99502a796e5a\") " pod="openshift-network-operator/iptables-alerter-p4lxv"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815677 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-run-openvswitch\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815701 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-run-netns\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815716 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a6381004-4b02-4afb-9f2a-99502a796e5a-host-slash\") pod \"iptables-alerter-p4lxv\" (UID: \"a6381004-4b02-4afb-9f2a-99502a796e5a\") " pod="openshift-network-operator/iptables-alerter-p4lxv"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815722 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-node-log\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815761 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-cni-bin\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815767 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-node-log\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815795 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-sys-fs\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815822 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lb6vk\" (UniqueName: \"kubernetes.io/projected/e251a3a4-2359-4dcf-90f8-20b9a43b9aa4-kube-api-access-lb6vk\") pod \"node-resolver-9wldk\" (UID: \"e251a3a4-2359-4dcf-90f8-20b9a43b9aa4\") " pod="openshift-dns/node-resolver-9wldk"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815840 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-cni-bin\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.816238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815795 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-run-netns\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815849 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-systemd-units\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815849 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-sys-fs\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k"
Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815884 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-systemd-units\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815889 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/848efa24-3ed5-46b7-b923-74011caa024a-agent-certs\") pod \"konnectivity-agent-fsfjs\" (UID: \"848efa24-3ed5-46b7-b923-74011caa024a\") " pod="kube-system/konnectivity-agent-fsfjs"
Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815916 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a6381004-4b02-4afb-9f2a-99502a796e5a-iptables-alerter-script\") pod \"iptables-alerter-p4lxv\" (UID: \"a6381004-4b02-4afb-9f2a-99502a796e5a\") " pod="openshift-network-operator/iptables-alerter-p4lxv"
Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815933 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-log-socket\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815950 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-cni-netd\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815972 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-ovnkube-script-lib\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.815991 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-log-socket\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b"
Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816007 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\"
(UniqueName: \"kubernetes.io/configmap/e1bc6add-6e54-4e69-ae6d-573e33e1dc7a-serviceca\") pod \"node-ca-d6jcn\" (UID: \"e1bc6add-6e54-4e69-ae6d-573e33e1dc7a\") " pod="openshift-image-registry/node-ca-d6jcn" Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816031 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-cni-netd\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816031 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-kubelet\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816064 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-run-systemd\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816089 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-slash\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816113 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-env-overrides\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816150 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-registration-dir\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k" Apr 19 15:24:03.817018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816164 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-run-systemd\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816176 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e251a3a4-2359-4dcf-90f8-20b9a43b9aa4-hosts-file\") pod \"node-resolver-9wldk\" (UID: \"e251a3a4-2359-4dcf-90f8-20b9a43b9aa4\") " pod="openshift-dns/node-resolver-9wldk" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816193 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-slash\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816208 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grw8s\" (UniqueName: 
\"kubernetes.io/projected/80d5a93a-4613-43be-aa6a-427108e9e36e-kube-api-access-grw8s\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816235 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-run-ovn\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816236 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80d5a93a-4613-43be-aa6a-427108e9e36e-registration-dir\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816274 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e251a3a4-2359-4dcf-90f8-20b9a43b9aa4-hosts-file\") pod \"node-resolver-9wldk\" (UID: \"e251a3a4-2359-4dcf-90f8-20b9a43b9aa4\") " pod="openshift-dns/node-resolver-9wldk" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816333 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e251a3a4-2359-4dcf-90f8-20b9a43b9aa4-tmp-dir\") pod \"node-resolver-9wldk\" (UID: \"e251a3a4-2359-4dcf-90f8-20b9a43b9aa4\") " pod="openshift-dns/node-resolver-9wldk" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816365 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/848efa24-3ed5-46b7-b923-74011caa024a-konnectivity-ca\") pod \"konnectivity-agent-fsfjs\" (UID: \"848efa24-3ed5-46b7-b923-74011caa024a\") " pod="kube-system/konnectivity-agent-fsfjs" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816375 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-run-ovn\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816378 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-host-kubelet\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816595 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-ovnkube-config\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816862 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-ovnkube-script-lib\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816883 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/a6381004-4b02-4afb-9f2a-99502a796e5a-iptables-alerter-script\") pod \"iptables-alerter-p4lxv\" (UID: \"a6381004-4b02-4afb-9f2a-99502a796e5a\") " pod="openshift-network-operator/iptables-alerter-p4lxv" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816865 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e1bc6add-6e54-4e69-ae6d-573e33e1dc7a-serviceca\") pod \"node-ca-d6jcn\" (UID: \"e1bc6add-6e54-4e69-ae6d-573e33e1dc7a\") " pod="openshift-image-registry/node-ca-d6jcn" Apr 19 15:24:03.817727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.816960 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-env-overrides\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.818181 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.817791 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-ovn-node-metrics-cert\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.818446 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.818422 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/848efa24-3ed5-46b7-b923-74011caa024a-agent-certs\") pod \"konnectivity-agent-fsfjs\" (UID: \"848efa24-3ed5-46b7-b923-74011caa024a\") " pod="kube-system/konnectivity-agent-fsfjs" Apr 19 15:24:03.821175 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:03.821156 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 15:24:03.821248 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:03.821179 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 15:24:03.821248 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:03.821195 2564 projected.go:194] Error preparing data for projected volume kube-api-access-2cjtd for pod openshift-network-diagnostics/network-check-target-g9dx5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:24:03.821344 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:03.821253 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd podName:07623157-4c19-4793-880d-b21867ce44f7 nodeName:}" failed. No retries permitted until 2026-04-19 15:24:04.321235332 +0000 UTC m=+3.121150872 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2cjtd" (UniqueName: "kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd") pod "network-check-target-g9dx5" (UID: "07623157-4c19-4793-880d-b21867ce44f7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:24:03.823465 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.823398 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zws8\" (UniqueName: \"kubernetes.io/projected/e1bc6add-6e54-4e69-ae6d-573e33e1dc7a-kube-api-access-6zws8\") pod \"node-ca-d6jcn\" (UID: \"e1bc6add-6e54-4e69-ae6d-573e33e1dc7a\") " pod="openshift-image-registry/node-ca-d6jcn" Apr 19 15:24:03.823465 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.823399 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58g4c\" (UniqueName: \"kubernetes.io/projected/a6381004-4b02-4afb-9f2a-99502a796e5a-kube-api-access-58g4c\") pod \"iptables-alerter-p4lxv\" (UID: \"a6381004-4b02-4afb-9f2a-99502a796e5a\") " pod="openshift-network-operator/iptables-alerter-p4lxv" Apr 19 15:24:03.823789 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.823767 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grw8s\" (UniqueName: \"kubernetes.io/projected/80d5a93a-4613-43be-aa6a-427108e9e36e-kube-api-access-grw8s\") pod \"aws-ebs-csi-driver-node-lkb7k\" (UID: \"80d5a93a-4613-43be-aa6a-427108e9e36e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k" Apr 19 15:24:03.824066 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.824040 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb6vk\" (UniqueName: \"kubernetes.io/projected/e251a3a4-2359-4dcf-90f8-20b9a43b9aa4-kube-api-access-lb6vk\") pod \"node-resolver-9wldk\" (UID: 
\"e251a3a4-2359-4dcf-90f8-20b9a43b9aa4\") " pod="openshift-dns/node-resolver-9wldk" Apr 19 15:24:03.824352 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.824335 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xj27\" (UniqueName: \"kubernetes.io/projected/cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3-kube-api-access-2xj27\") pod \"ovnkube-node-pwk8b\" (UID: \"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.903731 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.903699 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2jvjq" Apr 19 15:24:03.914469 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.914444 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" Apr 19 15:24:03.929471 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.929442 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-57hmj" Apr 19 15:24:03.938117 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.938086 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fsfjs" Apr 19 15:24:03.946670 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.946652 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-p4lxv" Apr 19 15:24:03.953325 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.953272 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:03.960877 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.960860 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-d6jcn" Apr 19 15:24:03.968364 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.968347 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k" Apr 19 15:24:03.976864 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:03.976849 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9wldk" Apr 19 15:24:04.219474 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:04.219397 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs\") pod \"network-metrics-daemon-6gs28\" (UID: \"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9\") " pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:04.219646 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:04.219570 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:24:04.219695 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:04.219667 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs podName:4bc5acd5-37f1-4ca4-bb28-39e2f95194f9 nodeName:}" failed. No retries permitted until 2026-04-19 15:24:05.219649979 +0000 UTC m=+4.019565535 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs") pod "network-metrics-daemon-6gs28" (UID: "4bc5acd5-37f1-4ca4-bb28-39e2f95194f9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:24:04.360088 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:04.360055 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 15:24:04.420485 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:04.420448 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cjtd\" (UniqueName: \"kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd\") pod \"network-check-target-g9dx5\" (UID: \"07623157-4c19-4793-880d-b21867ce44f7\") " pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:04.420683 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:04.420654 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 15:24:04.420683 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:04.420678 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 15:24:04.420818 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:04.420693 2564 projected.go:194] Error preparing data for projected volume kube-api-access-2cjtd for pod openshift-network-diagnostics/network-check-target-g9dx5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:24:04.420818 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:04.420806 2564 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd podName:07623157-4c19-4793-880d-b21867ce44f7 nodeName:}" failed. No retries permitted until 2026-04-19 15:24:05.420785903 +0000 UTC m=+4.220701447 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2cjtd" (UniqueName: "kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd") pod "network-check-target-g9dx5" (UID: "07623157-4c19-4793-880d-b21867ce44f7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:24:04.643556 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:04.643492 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 15:19:02 +0000 UTC" deadline="2028-01-22 14:37:35.007867466 +0000 UTC" Apr 19 15:24:04.643556 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:04.643521 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15431h13m30.364348899s" Apr 19 15:24:04.688113 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:04.687911 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda265fc76_7ac3_4ce6_b6a9_f1f988b751d5.slice/crio-691a24597d6d326ebd6f6ce54d642c389eea4b420afd52170147fa26e14e2c05 WatchSource:0}: Error finding container 691a24597d6d326ebd6f6ce54d642c389eea4b420afd52170147fa26e14e2c05: Status 404 returned error can't find the container with id 691a24597d6d326ebd6f6ce54d642c389eea4b420afd52170147fa26e14e2c05 Apr 19 15:24:04.689771 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:04.689600 2564 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80d5a93a_4613_43be_aa6a_427108e9e36e.slice/crio-9aa2fe5f6c1d89272e2493c9dc68f38885ca535bb5926d8632c9e55cb529beca WatchSource:0}: Error finding container 9aa2fe5f6c1d89272e2493c9dc68f38885ca535bb5926d8632c9e55cb529beca: Status 404 returned error can't find the container with id 9aa2fe5f6c1d89272e2493c9dc68f38885ca535bb5926d8632c9e55cb529beca Apr 19 15:24:04.692485 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:04.691442 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod848efa24_3ed5_46b7_b923_74011caa024a.slice/crio-7b3cd1ecb9db31fc13bd54818acfe0915cd22569f17793dc1375b7ee27ab23ef WatchSource:0}: Error finding container 7b3cd1ecb9db31fc13bd54818acfe0915cd22569f17793dc1375b7ee27ab23ef: Status 404 returned error can't find the container with id 7b3cd1ecb9db31fc13bd54818acfe0915cd22569f17793dc1375b7ee27ab23ef Apr 19 15:24:04.693510 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:04.693483 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1bc6add_6e54_4e69_ae6d_573e33e1dc7a.slice/crio-3ca72137c2db6ec0d7fe14deb765231607775dcbf33c14b6d6c4ef75af6f1516 WatchSource:0}: Error finding container 3ca72137c2db6ec0d7fe14deb765231607775dcbf33c14b6d6c4ef75af6f1516: Status 404 returned error can't find the container with id 3ca72137c2db6ec0d7fe14deb765231607775dcbf33c14b6d6c4ef75af6f1516 Apr 19 15:24:04.694524 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:04.694308 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4487f66_c637_425c_a304_b53a5a1d6b25.slice/crio-72e6a621cdf7e862450638f89f21b8420f589f1fa283fef97d54045d5dc24b24 WatchSource:0}: Error finding container 72e6a621cdf7e862450638f89f21b8420f589f1fa283fef97d54045d5dc24b24: Status 404 returned error can't find the 
container with id 72e6a621cdf7e862450638f89f21b8420f589f1fa283fef97d54045d5dc24b24 Apr 19 15:24:04.695687 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:04.695668 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode251a3a4_2359_4dcf_90f8_20b9a43b9aa4.slice/crio-dd20a29b7cac0893810dfc8c79df04acc1929c345ace1677356ccd6a38e48a5e WatchSource:0}: Error finding container dd20a29b7cac0893810dfc8c79df04acc1929c345ace1677356ccd6a38e48a5e: Status 404 returned error can't find the container with id dd20a29b7cac0893810dfc8c79df04acc1929c345ace1677356ccd6a38e48a5e Apr 19 15:24:04.696530 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:04.696506 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f200cd_af39_4394_9bd6_10c3058b2139.slice/crio-538a1e7078799dc327478d0b0d7ea56b2063be88fe5c4d885705cdc12f5ea630 WatchSource:0}: Error finding container 538a1e7078799dc327478d0b0d7ea56b2063be88fe5c4d885705cdc12f5ea630: Status 404 returned error can't find the container with id 538a1e7078799dc327478d0b0d7ea56b2063be88fe5c4d885705cdc12f5ea630 Apr 19 15:24:04.698150 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:04.697686 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa9f4ef_4f5b_4af5_84b7_bfeb2ca1baf3.slice/crio-b574f7f5a17b351f910584578296161afa7bab41324f35411917111bab9e5c13 WatchSource:0}: Error finding container b574f7f5a17b351f910584578296161afa7bab41324f35411917111bab9e5c13: Status 404 returned error can't find the container with id b574f7f5a17b351f910584578296161afa7bab41324f35411917111bab9e5c13 Apr 19 15:24:05.225316 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.225286 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs\") pod \"network-metrics-daemon-6gs28\" (UID: \"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9\") " pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:05.225469 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:05.225391 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:24:05.225469 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:05.225438 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs podName:4bc5acd5-37f1-4ca4-bb28-39e2f95194f9 nodeName:}" failed. No retries permitted until 2026-04-19 15:24:07.225426125 +0000 UTC m=+6.025341662 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs") pod "network-metrics-daemon-6gs28" (UID: "4bc5acd5-37f1-4ca4-bb28-39e2f95194f9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:24:05.428046 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.427389 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cjtd\" (UniqueName: \"kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd\") pod \"network-check-target-g9dx5\" (UID: \"07623157-4c19-4793-880d-b21867ce44f7\") " pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:05.428046 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:05.427536 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 15:24:05.428046 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:05.427555 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 15:24:05.428046 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:05.427567 2564 projected.go:194] Error preparing data for projected volume kube-api-access-2cjtd for pod openshift-network-diagnostics/network-check-target-g9dx5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:24:05.428046 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:05.427623 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd podName:07623157-4c19-4793-880d-b21867ce44f7 nodeName:}" failed. No retries permitted until 2026-04-19 15:24:07.42760514 +0000 UTC m=+6.227520694 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2cjtd" (UniqueName: "kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd") pod "network-check-target-g9dx5" (UID: "07623157-4c19-4793-880d-b21867ce44f7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:24:05.662049 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.661966 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:05.662049 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.661998 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:05.662536 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:05.662109 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:05.662536 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:05.662499 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:05.694766 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.694688 2564 generic.go:358] "Generic (PLEG): container finished" podID="576754544584064ebee6a3db50a5f93e" containerID="ac6a4810e618b4067598e7ac9c23e0968a94da9b446442a1a2e32ad5aace9361" exitCode=0 Apr 19 15:24:05.694905 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.694764 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal" event={"ID":"576754544584064ebee6a3db50a5f93e","Type":"ContainerDied","Data":"ac6a4810e618b4067598e7ac9c23e0968a94da9b446442a1a2e32ad5aace9361"} Apr 19 15:24:05.705097 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.705028 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p4lxv" 
event={"ID":"a6381004-4b02-4afb-9f2a-99502a796e5a","Type":"ContainerStarted","Data":"ebd07d0a9e0daaff720523c7e88b4b307ed9e0415b8f2b15be5aaa52f0b2a4a9"} Apr 19 15:24:05.723596 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.723534 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" event={"ID":"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3","Type":"ContainerStarted","Data":"b574f7f5a17b351f910584578296161afa7bab41324f35411917111bab9e5c13"} Apr 19 15:24:05.737387 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.737322 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" event={"ID":"79f200cd-af39-4394-9bd6-10c3058b2139","Type":"ContainerStarted","Data":"538a1e7078799dc327478d0b0d7ea56b2063be88fe5c4d885705cdc12f5ea630"} Apr 19 15:24:05.748695 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.748665 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fsfjs" event={"ID":"848efa24-3ed5-46b7-b923-74011caa024a","Type":"ContainerStarted","Data":"7b3cd1ecb9db31fc13bd54818acfe0915cd22569f17793dc1375b7ee27ab23ef"} Apr 19 15:24:05.762075 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.762044 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k" event={"ID":"80d5a93a-4613-43be-aa6a-427108e9e36e","Type":"ContainerStarted","Data":"9aa2fe5f6c1d89272e2493c9dc68f38885ca535bb5926d8632c9e55cb529beca"} Apr 19 15:24:05.767508 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.767480 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2jvjq" event={"ID":"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5","Type":"ContainerStarted","Data":"691a24597d6d326ebd6f6ce54d642c389eea4b420afd52170147fa26e14e2c05"} Apr 19 15:24:05.778861 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.778835 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-131-48.ec2.internal" event={"ID":"b98e067c43f6f3381b105d98ca711c33","Type":"ContainerStarted","Data":"8112d8d28bcca10b14a7ff682720ba0643ace2a771a077adb6640f7aa1fbb7c9"} Apr 19 15:24:05.791709 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.791651 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-48.ec2.internal" podStartSLOduration=3.791618845 podStartE2EDuration="3.791618845s" podCreationTimestamp="2026-04-19 15:24:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:24:05.791239622 +0000 UTC m=+4.591155181" watchObservedRunningTime="2026-04-19 15:24:05.791618845 +0000 UTC m=+4.591534405" Apr 19 15:24:05.807977 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.807946 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9wldk" event={"ID":"e251a3a4-2359-4dcf-90f8-20b9a43b9aa4","Type":"ContainerStarted","Data":"dd20a29b7cac0893810dfc8c79df04acc1929c345ace1677356ccd6a38e48a5e"} Apr 19 15:24:05.817399 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.817151 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57hmj" event={"ID":"f4487f66-c637-425c-a304-b53a5a1d6b25","Type":"ContainerStarted","Data":"72e6a621cdf7e862450638f89f21b8420f589f1fa283fef97d54045d5dc24b24"} Apr 19 15:24:05.833133 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:05.832961 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d6jcn" event={"ID":"e1bc6add-6e54-4e69-ae6d-573e33e1dc7a","Type":"ContainerStarted","Data":"3ca72137c2db6ec0d7fe14deb765231607775dcbf33c14b6d6c4ef75af6f1516"} Apr 19 15:24:06.863929 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:06.863764 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal" event={"ID":"576754544584064ebee6a3db50a5f93e","Type":"ContainerStarted","Data":"04bf7b72064baf562eeebd86647e28d1364702ca1fc4327a75d6b7a14cf94b2d"} Apr 19 15:24:07.244789 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:07.244569 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs\") pod \"network-metrics-daemon-6gs28\" (UID: \"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9\") " pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:07.244937 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:07.244781 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:24:07.244937 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:07.244861 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs podName:4bc5acd5-37f1-4ca4-bb28-39e2f95194f9 nodeName:}" failed. No retries permitted until 2026-04-19 15:24:11.244840848 +0000 UTC m=+10.044756389 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs") pod "network-metrics-daemon-6gs28" (UID: "4bc5acd5-37f1-4ca4-bb28-39e2f95194f9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:24:07.446514 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:07.446477 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cjtd\" (UniqueName: \"kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd\") pod \"network-check-target-g9dx5\" (UID: \"07623157-4c19-4793-880d-b21867ce44f7\") " pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:07.446702 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:07.446654 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 15:24:07.446702 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:07.446675 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 15:24:07.446702 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:07.446688 2564 projected.go:194] Error preparing data for projected volume kube-api-access-2cjtd for pod openshift-network-diagnostics/network-check-target-g9dx5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:24:07.446890 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:07.446746 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd podName:07623157-4c19-4793-880d-b21867ce44f7 nodeName:}" failed. 
No retries permitted until 2026-04-19 15:24:11.446728604 +0000 UTC m=+10.246644158 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2cjtd" (UniqueName: "kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd") pod "network-check-target-g9dx5" (UID: "07623157-4c19-4793-880d-b21867ce44f7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:24:07.664572 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:07.662620 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:07.664572 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:07.662620 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:07.664572 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:07.662784 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:07.664572 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:07.662838 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:09.662648 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:09.662607 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:09.663080 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:09.662744 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:09.663080 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:09.662818 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:09.663080 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:09.662936 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:11.279053 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:11.279001 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs\") pod \"network-metrics-daemon-6gs28\" (UID: \"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9\") " pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:11.279496 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:11.279142 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:24:11.279496 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:11.279215 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs podName:4bc5acd5-37f1-4ca4-bb28-39e2f95194f9 nodeName:}" failed. No retries permitted until 2026-04-19 15:24:19.279193244 +0000 UTC m=+18.079108786 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs") pod "network-metrics-daemon-6gs28" (UID: "4bc5acd5-37f1-4ca4-bb28-39e2f95194f9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:24:11.479984 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:11.479946 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cjtd\" (UniqueName: \"kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd\") pod \"network-check-target-g9dx5\" (UID: \"07623157-4c19-4793-880d-b21867ce44f7\") " pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:11.480160 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:11.480127 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 15:24:11.480160 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:11.480152 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 15:24:11.480287 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:11.480166 2564 projected.go:194] Error preparing data for projected volume kube-api-access-2cjtd for pod openshift-network-diagnostics/network-check-target-g9dx5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:24:11.480287 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:11.480218 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd podName:07623157-4c19-4793-880d-b21867ce44f7 nodeName:}" failed. 
No retries permitted until 2026-04-19 15:24:19.480202731 +0000 UTC m=+18.280118268 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2cjtd" (UniqueName: "kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd") pod "network-check-target-g9dx5" (UID: "07623157-4c19-4793-880d-b21867ce44f7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:24:11.663659 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:11.663562 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:11.663813 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:11.663706 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:11.664261 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:11.664093 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:11.664261 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:11.664214 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:13.662357 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:13.662323 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:13.662842 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:13.662462 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:13.662842 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:13.662511 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:13.662842 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:13.662621 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:15.662079 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:15.662046 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:15.662486 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:15.662151 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:15.662486 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:15.662207 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:15.662486 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:15.662315 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:17.661695 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:17.661659 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:17.662224 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:17.661784 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:17.662224 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:17.661845 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:17.662224 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:17.661977 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:19.343360 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:19.343322 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs\") pod \"network-metrics-daemon-6gs28\" (UID: \"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9\") " pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:19.343767 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:19.343452 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:24:19.343767 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:19.343506 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs podName:4bc5acd5-37f1-4ca4-bb28-39e2f95194f9 nodeName:}" failed. No retries permitted until 2026-04-19 15:24:35.343491999 +0000 UTC m=+34.143407537 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs") pod "network-metrics-daemon-6gs28" (UID: "4bc5acd5-37f1-4ca4-bb28-39e2f95194f9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:24:19.544853 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:19.544819 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cjtd\" (UniqueName: \"kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd\") pod \"network-check-target-g9dx5\" (UID: \"07623157-4c19-4793-880d-b21867ce44f7\") " pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:19.545032 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:19.545011 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 15:24:19.545099 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:19.545042 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 15:24:19.545099 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:19.545058 2564 projected.go:194] Error preparing data for projected volume kube-api-access-2cjtd for pod openshift-network-diagnostics/network-check-target-g9dx5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:24:19.545187 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:19.545123 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd podName:07623157-4c19-4793-880d-b21867ce44f7 nodeName:}" failed. 
No retries permitted until 2026-04-19 15:24:35.545104547 +0000 UTC m=+34.345020102 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2cjtd" (UniqueName: "kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd") pod "network-check-target-g9dx5" (UID: "07623157-4c19-4793-880d-b21867ce44f7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:24:19.664618 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:19.664543 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:19.664778 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:19.664543 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:19.664778 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:19.664668 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:19.664778 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:19.664733 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:21.665674 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.665287 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:21.665674 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:21.665416 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:21.673132 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.670906 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:21.673132 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:21.671060 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:21.890406 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.890363 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" event={"ID":"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3","Type":"ContainerStarted","Data":"be264178bf98bf60718e42226c566e16426a1155dd63c7e98b150f7915d956b6"} Apr 19 15:24:21.890406 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.890400 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" event={"ID":"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3","Type":"ContainerStarted","Data":"68cdbfa4ed3bcf486fd7f60955c315b9f42b15a88c28555fd2a9ba1f66d26c0e"} Apr 19 15:24:21.891512 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.891436 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" event={"ID":"79f200cd-af39-4394-9bd6-10c3058b2139","Type":"ContainerStarted","Data":"3f90ab12526dd22e2c0f1c732e70825085aeb338ca2a1044c4c358eed838b7e9"} Apr 19 15:24:21.892390 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.892366 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fsfjs" event={"ID":"848efa24-3ed5-46b7-b923-74011caa024a","Type":"ContainerStarted","Data":"ee6148ad659aefcd7912aabf5fde098deb30501f3d6a2cee4025534edfe77fbe"} Apr 19 15:24:21.893213 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.893189 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k" event={"ID":"80d5a93a-4613-43be-aa6a-427108e9e36e","Type":"ContainerStarted","Data":"9b99f0615ec79cc24277f02e97a34e0653cce3f2d5224307c78033e241cd1995"} Apr 19 15:24:21.894072 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.894041 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2jvjq" 
event={"ID":"a265fc76-7ac3-4ce6-b6a9-f1f988b751d5","Type":"ContainerStarted","Data":"cb46a3d795f9564fa98eaef82c199749a771eb1fa933cc9933a4f1ead6c17cf2"} Apr 19 15:24:21.895543 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.895494 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9wldk" event={"ID":"e251a3a4-2359-4dcf-90f8-20b9a43b9aa4","Type":"ContainerStarted","Data":"e9f381d3673e2aa9c72bda053447941b47556da0efeb496fd4eaf25e97b479f5"} Apr 19 15:24:21.896734 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.896713 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57hmj" event={"ID":"f4487f66-c637-425c-a304-b53a5a1d6b25","Type":"ContainerStarted","Data":"f70c8b359abcb3124f3563f716e0c0367191b0c92ecb4b401876c5352291db8c"} Apr 19 15:24:21.897917 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.897841 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d6jcn" event={"ID":"e1bc6add-6e54-4e69-ae6d-573e33e1dc7a","Type":"ContainerStarted","Data":"a578983eb18a5410bfecd86f452dfd5484663e9c9d874884c61fd938ef24846e"} Apr 19 15:24:21.917612 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.917338 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-48.ec2.internal" podStartSLOduration=19.917299376 podStartE2EDuration="19.917299376s" podCreationTimestamp="2026-04-19 15:24:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:24:06.878138453 +0000 UTC m=+5.678054013" watchObservedRunningTime="2026-04-19 15:24:21.917299376 +0000 UTC m=+20.717214930" Apr 19 15:24:21.918822 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.918225 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5nvtf" 
podStartSLOduration=4.070421067 podStartE2EDuration="20.918211158s" podCreationTimestamp="2026-04-19 15:24:01 +0000 UTC" firstStartedPulling="2026-04-19 15:24:04.700573657 +0000 UTC m=+3.500495341" lastFinishedPulling="2026-04-19 15:24:21.548369893 +0000 UTC m=+20.348285432" observedRunningTime="2026-04-19 15:24:21.917031279 +0000 UTC m=+20.716946841" watchObservedRunningTime="2026-04-19 15:24:21.918211158 +0000 UTC m=+20.718126724" Apr 19 15:24:21.933158 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.933121 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2jvjq" podStartSLOduration=3.948806962 podStartE2EDuration="20.933111027s" podCreationTimestamp="2026-04-19 15:24:01 +0000 UTC" firstStartedPulling="2026-04-19 15:24:04.689874528 +0000 UTC m=+3.489790079" lastFinishedPulling="2026-04-19 15:24:21.674178591 +0000 UTC m=+20.474094144" observedRunningTime="2026-04-19 15:24:21.932790857 +0000 UTC m=+20.732706417" watchObservedRunningTime="2026-04-19 15:24:21.933111027 +0000 UTC m=+20.733026585" Apr 19 15:24:21.974929 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.974878 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d6jcn" podStartSLOduration=3.132294746 podStartE2EDuration="19.974859226s" podCreationTimestamp="2026-04-19 15:24:02 +0000 UTC" firstStartedPulling="2026-04-19 15:24:04.695479074 +0000 UTC m=+3.495394615" lastFinishedPulling="2026-04-19 15:24:21.538043546 +0000 UTC m=+20.337959095" observedRunningTime="2026-04-19 15:24:21.974323685 +0000 UTC m=+20.774239283" watchObservedRunningTime="2026-04-19 15:24:21.974859226 +0000 UTC m=+20.774774787" Apr 19 15:24:21.990789 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:21.990733 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9wldk" podStartSLOduration=3.150447583 podStartE2EDuration="19.990714929s" podCreationTimestamp="2026-04-19 
15:24:02 +0000 UTC" firstStartedPulling="2026-04-19 15:24:04.698155347 +0000 UTC m=+3.498070890" lastFinishedPulling="2026-04-19 15:24:21.538422693 +0000 UTC m=+20.338338236" observedRunningTime="2026-04-19 15:24:21.990148811 +0000 UTC m=+20.790064397" watchObservedRunningTime="2026-04-19 15:24:21.990714929 +0000 UTC m=+20.790630523" Apr 19 15:24:22.013012 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:22.012948 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fsfjs" podStartSLOduration=8.462946711 podStartE2EDuration="21.012930326s" podCreationTimestamp="2026-04-19 15:24:01 +0000 UTC" firstStartedPulling="2026-04-19 15:24:04.692965534 +0000 UTC m=+3.492881071" lastFinishedPulling="2026-04-19 15:24:17.24294914 +0000 UTC m=+16.042864686" observedRunningTime="2026-04-19 15:24:22.012261598 +0000 UTC m=+20.812177162" watchObservedRunningTime="2026-04-19 15:24:22.012930326 +0000 UTC m=+20.812845898" Apr 19 15:24:22.901427 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:22.901394 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" event={"ID":"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3","Type":"ContainerStarted","Data":"89e43c6f8cb6441f7e80258ff7a9f85e747c171c9114fce8904ac841dd455e3c"} Apr 19 15:24:22.901427 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:22.901430 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" event={"ID":"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3","Type":"ContainerStarted","Data":"52071912851c6af388780a600e619b2403cbb4643fe2ad926a3864f275273659"} Apr 19 15:24:22.902070 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:22.901440 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" event={"ID":"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3","Type":"ContainerStarted","Data":"885cd651c62e72e366baf91f8b9b8663fa9ad905a2d6098b23ec4e1dbab4fdcf"} Apr 19 
15:24:22.902070 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:22.901451 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" event={"ID":"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3","Type":"ContainerStarted","Data":"e3df9b5f77c09004f9c9551eda8466b0d0fdf7ca3e2a33fd9a0226ff73edb96a"} Apr 19 15:24:22.902672 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:22.902651 2564 generic.go:358] "Generic (PLEG): container finished" podID="f4487f66-c637-425c-a304-b53a5a1d6b25" containerID="f70c8b359abcb3124f3563f716e0c0367191b0c92ecb4b401876c5352291db8c" exitCode=0 Apr 19 15:24:22.902777 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:22.902746 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57hmj" event={"ID":"f4487f66-c637-425c-a304-b53a5a1d6b25","Type":"ContainerDied","Data":"f70c8b359abcb3124f3563f716e0c0367191b0c92ecb4b401876c5352291db8c"} Apr 19 15:24:22.989477 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:22.989449 2564 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 19 15:24:23.662130 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:23.661903 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:23.662374 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:23.662249 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:23.662749 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:23.662726 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:23.662885 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:23.662831 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:23.674839 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:23.674737 2564 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-19T15:24:22.98947185Z","UUID":"0736824b-22ef-469d-9f12-a4a2c3300051","Handler":null,"Name":"","Endpoint":""} Apr 19 15:24:23.676500 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:23.676477 2564 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 19 15:24:23.676607 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:23.676506 2564 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 19 15:24:23.905077 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:23.905047 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p4lxv" 
event={"ID":"a6381004-4b02-4afb-9f2a-99502a796e5a","Type":"ContainerStarted","Data":"9daece0bf113242e9a58338a36d245e2605b725c178d55102f78ade6aa3fa218"} Apr 19 15:24:23.906551 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:23.906527 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k" event={"ID":"80d5a93a-4613-43be-aa6a-427108e9e36e","Type":"ContainerStarted","Data":"d2227d0ff0b604e33b428feb2d61e8cab5d0ab9ed57bc569b17cb51b3aaf34d6"} Apr 19 15:24:23.921059 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:23.921005 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-p4lxv" podStartSLOduration=6.08366001 podStartE2EDuration="22.920990648s" podCreationTimestamp="2026-04-19 15:24:01 +0000 UTC" firstStartedPulling="2026-04-19 15:24:04.700867595 +0000 UTC m=+3.500783133" lastFinishedPulling="2026-04-19 15:24:21.538198218 +0000 UTC m=+20.338113771" observedRunningTime="2026-04-19 15:24:23.920357304 +0000 UTC m=+22.720272862" watchObservedRunningTime="2026-04-19 15:24:23.920990648 +0000 UTC m=+22.720906208" Apr 19 15:24:24.911212 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:24.911177 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" event={"ID":"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3","Type":"ContainerStarted","Data":"4ae4d3dd8c17e5434df7478a6653d37938813bf95c118b1603650b7a0a587194"} Apr 19 15:24:24.913209 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:24.913176 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k" event={"ID":"80d5a93a-4613-43be-aa6a-427108e9e36e","Type":"ContainerStarted","Data":"6ae90739781ffbde1c77ddaccfb2ff67eb4370eecb4b70514d0a757e36fa641c"} Apr 19 15:24:24.929003 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:24.928959 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkb7k" podStartSLOduration=3.716488697 podStartE2EDuration="22.92894673s" podCreationTimestamp="2026-04-19 15:24:02 +0000 UTC" firstStartedPulling="2026-04-19 15:24:04.691779387 +0000 UTC m=+3.491694937" lastFinishedPulling="2026-04-19 15:24:23.904237419 +0000 UTC m=+22.704152970" observedRunningTime="2026-04-19 15:24:24.928666396 +0000 UTC m=+23.728581958" watchObservedRunningTime="2026-04-19 15:24:24.92894673 +0000 UTC m=+23.728862290" Apr 19 15:24:25.662277 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:25.662245 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:25.662277 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:25.662262 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:25.662506 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:25.662353 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:25.662506 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:25.662480 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:26.380710 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:26.380678 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fsfjs" Apr 19 15:24:26.381656 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:26.381608 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fsfjs" Apr 19 15:24:26.886514 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:26.886348 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-p2kvw"] Apr 19 15:24:26.889469 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:26.889454 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:26.889534 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:26.889518 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-p2kvw" podUID="16c2a1f0-b512-4202-a476-b96d67bc2fcf" Apr 19 15:24:26.919873 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:26.919834 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" event={"ID":"cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3","Type":"ContainerStarted","Data":"4633e6a3d0ac00ec6e648fded8139dfb42b3b3847aece0df8927c4f2e97bc28a"} Apr 19 15:24:26.921561 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:26.920906 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:26.921561 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:26.920965 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:26.921561 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:26.920991 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fsfjs" Apr 19 15:24:26.921800 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:26.921740 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fsfjs" Apr 19 15:24:26.923497 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:26.921933 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:26.937856 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:26.937832 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:26.938382 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:26.938362 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:24:26.947595 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:26.947542 2564 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" podStartSLOduration=8.913815508 podStartE2EDuration="25.947527448s" podCreationTimestamp="2026-04-19 15:24:01 +0000 UTC" firstStartedPulling="2026-04-19 15:24:04.699873035 +0000 UTC m=+3.499788577" lastFinishedPulling="2026-04-19 15:24:21.733584969 +0000 UTC m=+20.533500517" observedRunningTime="2026-04-19 15:24:26.945538198 +0000 UTC m=+25.745453797" watchObservedRunningTime="2026-04-19 15:24:26.947527448 +0000 UTC m=+25.747443008" Apr 19 15:24:27.001215 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:27.001182 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/16c2a1f0-b512-4202-a476-b96d67bc2fcf-dbus\") pod \"global-pull-secret-syncer-p2kvw\" (UID: \"16c2a1f0-b512-4202-a476-b96d67bc2fcf\") " pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:27.001357 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:27.001291 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret\") pod \"global-pull-secret-syncer-p2kvw\" (UID: \"16c2a1f0-b512-4202-a476-b96d67bc2fcf\") " pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:27.001714 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:27.001409 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/16c2a1f0-b512-4202-a476-b96d67bc2fcf-kubelet-config\") pod \"global-pull-secret-syncer-p2kvw\" (UID: \"16c2a1f0-b512-4202-a476-b96d67bc2fcf\") " pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:27.102807 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:27.102767 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret\") pod \"global-pull-secret-syncer-p2kvw\" (UID: \"16c2a1f0-b512-4202-a476-b96d67bc2fcf\") " pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:27.102974 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:27.102854 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/16c2a1f0-b512-4202-a476-b96d67bc2fcf-kubelet-config\") pod \"global-pull-secret-syncer-p2kvw\" (UID: \"16c2a1f0-b512-4202-a476-b96d67bc2fcf\") " pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:27.102974 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:27.102883 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 15:24:27.102974 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:27.102895 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/16c2a1f0-b512-4202-a476-b96d67bc2fcf-dbus\") pod \"global-pull-secret-syncer-p2kvw\" (UID: \"16c2a1f0-b512-4202-a476-b96d67bc2fcf\") " pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:27.103107 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:27.102977 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret podName:16c2a1f0-b512-4202-a476-b96d67bc2fcf nodeName:}" failed. No retries permitted until 2026-04-19 15:24:27.602958295 +0000 UTC m=+26.402873835 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret") pod "global-pull-secret-syncer-p2kvw" (UID: "16c2a1f0-b512-4202-a476-b96d67bc2fcf") : object "kube-system"/"original-pull-secret" not registered Apr 19 15:24:27.103107 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:27.102997 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/16c2a1f0-b512-4202-a476-b96d67bc2fcf-kubelet-config\") pod \"global-pull-secret-syncer-p2kvw\" (UID: \"16c2a1f0-b512-4202-a476-b96d67bc2fcf\") " pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:27.103201 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:27.103138 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/16c2a1f0-b512-4202-a476-b96d67bc2fcf-dbus\") pod \"global-pull-secret-syncer-p2kvw\" (UID: \"16c2a1f0-b512-4202-a476-b96d67bc2fcf\") " pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:27.608402 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:27.608363 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret\") pod \"global-pull-secret-syncer-p2kvw\" (UID: \"16c2a1f0-b512-4202-a476-b96d67bc2fcf\") " pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:27.608833 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:27.608530 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 15:24:27.608833 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:27.608602 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret 
podName:16c2a1f0-b512-4202-a476-b96d67bc2fcf nodeName:}" failed. No retries permitted until 2026-04-19 15:24:28.608583383 +0000 UTC m=+27.408498924 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret") pod "global-pull-secret-syncer-p2kvw" (UID: "16c2a1f0-b512-4202-a476-b96d67bc2fcf") : object "kube-system"/"original-pull-secret" not registered Apr 19 15:24:27.662232 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:27.662196 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:27.662415 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:27.662304 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:27.662415 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:27.662372 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:27.662526 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:27.662502 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:28.259245 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:28.259211 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-p2kvw"] Apr 19 15:24:28.259448 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:28.259382 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:28.259510 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:28.259478 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p2kvw" podUID="16c2a1f0-b512-4202-a476-b96d67bc2fcf" Apr 19 15:24:28.262954 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:28.262928 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-g9dx5"] Apr 19 15:24:28.263113 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:28.263018 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:28.263202 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:28.263106 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:28.263648 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:28.263611 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6gs28"] Apr 19 15:24:28.263747 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:28.263700 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:28.263803 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:28.263787 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:28.615553 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:28.615481 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret\") pod \"global-pull-secret-syncer-p2kvw\" (UID: \"16c2a1f0-b512-4202-a476-b96d67bc2fcf\") " pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:28.615920 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:28.615617 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 15:24:28.615920 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:28.615697 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret podName:16c2a1f0-b512-4202-a476-b96d67bc2fcf nodeName:}" failed. 
No retries permitted until 2026-04-19 15:24:30.615681552 +0000 UTC m=+29.415597088 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret") pod "global-pull-secret-syncer-p2kvw" (UID: "16c2a1f0-b512-4202-a476-b96d67bc2fcf") : object "kube-system"/"original-pull-secret" not registered Apr 19 15:24:29.662583 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:29.662548 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:29.662583 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:29.662580 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:29.663027 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:29.662681 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:29.663027 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:29.662798 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-p2kvw" podUID="16c2a1f0-b512-4202-a476-b96d67bc2fcf" Apr 19 15:24:29.928221 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:29.928123 2564 generic.go:358] "Generic (PLEG): container finished" podID="f4487f66-c637-425c-a304-b53a5a1d6b25" containerID="e46d2343cacdbee796ba8a2874427254bb41a73bf290333cac314c9969d21bb3" exitCode=0 Apr 19 15:24:29.928221 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:29.928177 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57hmj" event={"ID":"f4487f66-c637-425c-a304-b53a5a1d6b25","Type":"ContainerDied","Data":"e46d2343cacdbee796ba8a2874427254bb41a73bf290333cac314c9969d21bb3"} Apr 19 15:24:30.631868 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:30.631790 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret\") pod \"global-pull-secret-syncer-p2kvw\" (UID: \"16c2a1f0-b512-4202-a476-b96d67bc2fcf\") " pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:30.631992 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:30.631925 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 15:24:30.631992 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:30.631986 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret podName:16c2a1f0-b512-4202-a476-b96d67bc2fcf nodeName:}" failed. No retries permitted until 2026-04-19 15:24:34.631971009 +0000 UTC m=+33.431886550 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret") pod "global-pull-secret-syncer-p2kvw" (UID: "16c2a1f0-b512-4202-a476-b96d67bc2fcf") : object "kube-system"/"original-pull-secret" not registered Apr 19 15:24:30.661794 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:30.661764 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:30.661949 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:30.661907 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:31.664919 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:31.664756 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:31.665287 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:31.664756 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:31.665287 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:31.664994 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-p2kvw" podUID="16c2a1f0-b512-4202-a476-b96d67bc2fcf" Apr 19 15:24:31.665287 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:31.665054 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:31.933566 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:31.933485 2564 generic.go:358] "Generic (PLEG): container finished" podID="f4487f66-c637-425c-a304-b53a5a1d6b25" containerID="fc9716d98b83f317c0c90f37eede68e4b61aadb88194aabc6f29ccefa88f15c1" exitCode=0 Apr 19 15:24:31.933566 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:31.933526 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57hmj" event={"ID":"f4487f66-c637-425c-a304-b53a5a1d6b25","Type":"ContainerDied","Data":"fc9716d98b83f317c0c90f37eede68e4b61aadb88194aabc6f29ccefa88f15c1"} Apr 19 15:24:32.662720 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:32.662689 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:32.662861 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:32.662836 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6gs28" podUID="4bc5acd5-37f1-4ca4-bb28-39e2f95194f9" Apr 19 15:24:33.665084 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:33.665055 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:33.665579 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:33.665055 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:33.665579 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:33.665144 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g9dx5" podUID="07623157-4c19-4793-880d-b21867ce44f7" Apr 19 15:24:33.665579 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:33.665222 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-p2kvw" podUID="16c2a1f0-b512-4202-a476-b96d67bc2fcf" Apr 19 15:24:33.939340 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:33.939258 2564 generic.go:358] "Generic (PLEG): container finished" podID="f4487f66-c637-425c-a304-b53a5a1d6b25" containerID="bd9aa71e19375876571f7875b583100689d87687468ccee2e037f80a203ed06a" exitCode=0 Apr 19 15:24:33.939340 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:33.939296 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57hmj" event={"ID":"f4487f66-c637-425c-a304-b53a5a1d6b25","Type":"ContainerDied","Data":"bd9aa71e19375876571f7875b583100689d87687468ccee2e037f80a203ed06a"} Apr 19 15:24:34.540203 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.540174 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-48.ec2.internal" event="NodeReady" Apr 19 15:24:34.540371 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.540313 2564 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 19 15:24:34.572055 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.572020 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d686c5c7-8g4hv"] Apr 19 15:24:34.599747 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.599721 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6"] Apr 19 15:24:34.599908 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.599851 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d686c5c7-8g4hv" Apr 19 15:24:34.602788 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.602515 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-gp89g\"" Apr 19 15:24:34.602788 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.602546 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 19 15:24:34.602788 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.602588 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 19 15:24:34.602788 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.602615 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 19 15:24:34.602788 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.602546 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 19 15:24:34.624414 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.624394 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr"] Apr 19 15:24:34.624550 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.624534 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.627030 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.626984 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 19 15:24:34.627030 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.627025 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 19 15:24:34.627227 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.627211 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 19 15:24:34.627289 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.627235 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 19 15:24:34.646727 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.646700 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6c7fffcdf6-4s9gn"] Apr 19 15:24:34.646849 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.646832 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" Apr 19 15:24:34.649883 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.649867 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 19 15:24:34.663937 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.663913 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret\") pod \"global-pull-secret-syncer-p2kvw\" (UID: \"16c2a1f0-b512-4202-a476-b96d67bc2fcf\") " pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:34.664130 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:34.664097 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 15:24:34.664245 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:34.664182 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret podName:16c2a1f0-b512-4202-a476-b96d67bc2fcf nodeName:}" failed. No retries permitted until 2026-04-19 15:24:42.664157648 +0000 UTC m=+41.464073202 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret") pod "global-pull-secret-syncer-p2kvw" (UID: "16c2a1f0-b512-4202-a476-b96d67bc2fcf") : object "kube-system"/"original-pull-secret" not registered Apr 19 15:24:34.665110 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.665093 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d686c5c7-8g4hv"] Apr 19 15:24:34.665552 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.665113 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6"] Apr 19 15:24:34.665552 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.665124 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lnp5x"] Apr 19 15:24:34.665552 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.665239 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.667835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.667761 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 19 15:24:34.667835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.667774 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 19 15:24:34.668666 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.668647 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-b94dq\"" Apr 19 15:24:34.668742 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.668728 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 19 15:24:34.674989 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.674967 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 19 15:24:34.685606 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.685586 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:24:34.685720 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.685681 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:34.686039 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.685586 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c7fffcdf6-4s9gn"] Apr 19 15:24:34.686039 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.686021 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr"] Apr 19 15:24:34.686162 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.686041 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lnp5x"] Apr 19 15:24:34.688451 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.688360 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 19 15:24:34.688546 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.688368 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 19 15:24:34.688546 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.688500 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 19 15:24:34.689123 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.689094 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 19 15:24:34.689344 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.689329 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zjn96\"" Apr 19 15:24:34.690158 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.690130 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zxl4j\"" Apr 19 15:24:34.694291 
ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.694117 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-frphc"] Apr 19 15:24:34.716686 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.716659 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-frphc"] Apr 19 15:24:34.716810 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.716740 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-frphc" Apr 19 15:24:34.719292 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.719273 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dvb5z\"" Apr 19 15:24:34.719403 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.719347 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 19 15:24:34.719811 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.719793 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 19 15:24:34.764859 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.764825 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk9r4\" (UniqueName: \"kubernetes.io/projected/f8d87387-5e0f-4caa-b9af-353b2855e997-kube-api-access-pk9r4\") pod \"managed-serviceaccount-addon-agent-d686c5c7-8g4hv\" (UID: \"f8d87387-5e0f-4caa-b9af-353b2855e997\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d686c5c7-8g4hv" Apr 19 15:24:34.765049 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.764875 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53349c81-aa87-40ff-92e5-714108c8cefc-installation-pull-secrets\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: 
\"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.765049 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.764946 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.765049 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.764987 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-bound-sa-token\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.765049 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.765016 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d771e17e-8f69-4e6b-b939-24cade594a96-ca\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.765049 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.765041 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d771e17e-8f69-4e6b-b939-24cade594a96-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.765300 ip-10-0-131-48 kubenswrapper[2564]: I0419 
15:24:34.765104 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f8d87387-5e0f-4caa-b9af-353b2855e997-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d686c5c7-8g4hv\" (UID: \"f8d87387-5e0f-4caa-b9af-353b2855e997\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d686c5c7-8g4hv" Apr 19 15:24:34.765300 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.765129 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/53349c81-aa87-40ff-92e5-714108c8cefc-image-registry-private-configuration\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.765300 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.765172 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53349c81-aa87-40ff-92e5-714108c8cefc-trusted-ca\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.765300 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.765187 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d771e17e-8f69-4e6b-b939-24cade594a96-hub\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.765300 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.765203 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hkpcb\" (UniqueName: \"kubernetes.io/projected/d771e17e-8f69-4e6b-b939-24cade594a96-kube-api-access-hkpcb\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.765300 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.765230 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/abdd09d7-7736-4592-ac16-cc3af5cd3b0c-tmp\") pod \"klusterlet-addon-workmgr-7cc95cddf6-5g7rr\" (UID: \"abdd09d7-7736-4592-ac16-cc3af5cd3b0c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" Apr 19 15:24:34.765300 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.765292 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d771e17e-8f69-4e6b-b939-24cade594a96-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.765657 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.765312 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/abdd09d7-7736-4592-ac16-cc3af5cd3b0c-klusterlet-config\") pod \"klusterlet-addon-workmgr-7cc95cddf6-5g7rr\" (UID: \"abdd09d7-7736-4592-ac16-cc3af5cd3b0c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" Apr 19 15:24:34.765657 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.765342 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/53349c81-aa87-40ff-92e5-714108c8cefc-registry-certificates\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.765657 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.765375 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdwgg\" (UniqueName: \"kubernetes.io/projected/abdd09d7-7736-4592-ac16-cc3af5cd3b0c-kube-api-access-rdwgg\") pod \"klusterlet-addon-workmgr-7cc95cddf6-5g7rr\" (UID: \"abdd09d7-7736-4592-ac16-cc3af5cd3b0c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" Apr 19 15:24:34.765657 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.765429 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53349c81-aa87-40ff-92e5-714108c8cefc-ca-trust-extracted\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.765657 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.765445 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d771e17e-8f69-4e6b-b939-24cade594a96-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.765657 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.765479 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd8w5\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-kube-api-access-wd8w5\") pod 
\"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.866786 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.866706 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d771e17e-8f69-4e6b-b939-24cade594a96-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.866786 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.866761 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/abdd09d7-7736-4592-ac16-cc3af5cd3b0c-klusterlet-config\") pod \"klusterlet-addon-workmgr-7cc95cddf6-5g7rr\" (UID: \"abdd09d7-7736-4592-ac16-cc3af5cd3b0c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" Apr 19 15:24:34.866982 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.866793 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7577782b-d81a-428d-abba-4b2f85606b5a-config-volume\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:34.866982 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.866813 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:34.866982 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.866828 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7577782b-d81a-428d-abba-4b2f85606b5a-tmp-dir\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:34.866982 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.866886 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53349c81-aa87-40ff-92e5-714108c8cefc-registry-certificates\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.866982 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.866929 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwgg\" (UniqueName: \"kubernetes.io/projected/abdd09d7-7736-4592-ac16-cc3af5cd3b0c-kube-api-access-rdwgg\") pod \"klusterlet-addon-workmgr-7cc95cddf6-5g7rr\" (UID: \"abdd09d7-7736-4592-ac16-cc3af5cd3b0c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" Apr 19 15:24:34.866982 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.866965 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53349c81-aa87-40ff-92e5-714108c8cefc-ca-trust-extracted\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.867202 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.866993 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d771e17e-8f69-4e6b-b939-24cade594a96-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: 
\"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.867202 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867024 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fvn6\" (UniqueName: \"kubernetes.io/projected/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-kube-api-access-4fvn6\") pod \"ingress-canary-lnp5x\" (UID: \"2f59fe9a-35b7-4dae-ae52-22cef5a86c77\") " pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:24:34.867202 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867050 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mwbb\" (UniqueName: \"kubernetes.io/projected/7577782b-d81a-428d-abba-4b2f85606b5a-kube-api-access-2mwbb\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:34.867202 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867082 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wd8w5\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-kube-api-access-wd8w5\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.867202 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867132 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert\") pod \"ingress-canary-lnp5x\" (UID: \"2f59fe9a-35b7-4dae-ae52-22cef5a86c77\") " pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:24:34.867202 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867167 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pk9r4\" (UniqueName: \"kubernetes.io/projected/f8d87387-5e0f-4caa-b9af-353b2855e997-kube-api-access-pk9r4\") pod \"managed-serviceaccount-addon-agent-d686c5c7-8g4hv\" (UID: \"f8d87387-5e0f-4caa-b9af-353b2855e997\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d686c5c7-8g4hv" Apr 19 15:24:34.867202 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867199 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53349c81-aa87-40ff-92e5-714108c8cefc-installation-pull-secrets\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.867568 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867258 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.867568 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867287 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-bound-sa-token\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.867568 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867312 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d771e17e-8f69-4e6b-b939-24cade594a96-ca\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.867568 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867338 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d771e17e-8f69-4e6b-b939-24cade594a96-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.867568 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867381 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f8d87387-5e0f-4caa-b9af-353b2855e997-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d686c5c7-8g4hv\" (UID: \"f8d87387-5e0f-4caa-b9af-353b2855e997\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d686c5c7-8g4hv" Apr 19 15:24:34.867568 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867412 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/53349c81-aa87-40ff-92e5-714108c8cefc-image-registry-private-configuration\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.867568 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867445 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d771e17e-8f69-4e6b-b939-24cade594a96-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.867568 ip-10-0-131-48 kubenswrapper[2564]: I0419 
15:24:34.867458 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53349c81-aa87-40ff-92e5-714108c8cefc-trusted-ca\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.867568 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867460 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53349c81-aa87-40ff-92e5-714108c8cefc-registry-certificates\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.867568 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867483 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d771e17e-8f69-4e6b-b939-24cade594a96-hub\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.867568 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867510 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkpcb\" (UniqueName: \"kubernetes.io/projected/d771e17e-8f69-4e6b-b939-24cade594a96-kube-api-access-hkpcb\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.867568 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867534 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/abdd09d7-7736-4592-ac16-cc3af5cd3b0c-tmp\") pod 
\"klusterlet-addon-workmgr-7cc95cddf6-5g7rr\" (UID: \"abdd09d7-7736-4592-ac16-cc3af5cd3b0c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" Apr 19 15:24:34.868555 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:34.867680 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 15:24:34.868555 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:34.867696 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c7fffcdf6-4s9gn: secret "image-registry-tls" not found Apr 19 15:24:34.868555 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:34.867752 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls podName:53349c81-aa87-40ff-92e5-714108c8cefc nodeName:}" failed. No retries permitted until 2026-04-19 15:24:35.36773399 +0000 UTC m=+34.167649746 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls") pod "image-registry-6c7fffcdf6-4s9gn" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc") : secret "image-registry-tls" not found Apr 19 15:24:34.868555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.867754 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53349c81-aa87-40ff-92e5-714108c8cefc-ca-trust-extracted\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.868555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.868074 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/abdd09d7-7736-4592-ac16-cc3af5cd3b0c-tmp\") pod \"klusterlet-addon-workmgr-7cc95cddf6-5g7rr\" (UID: \"abdd09d7-7736-4592-ac16-cc3af5cd3b0c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" Apr 19 15:24:34.868555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.868326 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53349c81-aa87-40ff-92e5-714108c8cefc-trusted-ca\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.872929 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.872810 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d771e17e-8f69-4e6b-b939-24cade594a96-ca\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.872929 
ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.872894 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d771e17e-8f69-4e6b-b939-24cade594a96-hub\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.872929 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.872899 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/53349c81-aa87-40ff-92e5-714108c8cefc-image-registry-private-configuration\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.873157 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.872923 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d771e17e-8f69-4e6b-b939-24cade594a96-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.873157 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.873024 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d771e17e-8f69-4e6b-b939-24cade594a96-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.873157 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.873100 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/f8d87387-5e0f-4caa-b9af-353b2855e997-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d686c5c7-8g4hv\" (UID: \"f8d87387-5e0f-4caa-b9af-353b2855e997\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d686c5c7-8g4hv" Apr 19 15:24:34.873607 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.873582 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/abdd09d7-7736-4592-ac16-cc3af5cd3b0c-klusterlet-config\") pod \"klusterlet-addon-workmgr-7cc95cddf6-5g7rr\" (UID: \"abdd09d7-7736-4592-ac16-cc3af5cd3b0c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" Apr 19 15:24:34.873744 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.873711 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53349c81-aa87-40ff-92e5-714108c8cefc-installation-pull-secrets\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.876120 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.875973 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwgg\" (UniqueName: \"kubernetes.io/projected/abdd09d7-7736-4592-ac16-cc3af5cd3b0c-kube-api-access-rdwgg\") pod \"klusterlet-addon-workmgr-7cc95cddf6-5g7rr\" (UID: \"abdd09d7-7736-4592-ac16-cc3af5cd3b0c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" Apr 19 15:24:34.876932 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.876908 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd8w5\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-kube-api-access-wd8w5\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: 
\"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.877070 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.877049 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkpcb\" (UniqueName: \"kubernetes.io/projected/d771e17e-8f69-4e6b-b939-24cade594a96-kube-api-access-hkpcb\") pod \"cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6\" (UID: \"d771e17e-8f69-4e6b-b939-24cade594a96\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.877896 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.877877 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-bound-sa-token\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:34.878697 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.878668 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk9r4\" (UniqueName: \"kubernetes.io/projected/f8d87387-5e0f-4caa-b9af-353b2855e997-kube-api-access-pk9r4\") pod \"managed-serviceaccount-addon-agent-d686c5c7-8g4hv\" (UID: \"f8d87387-5e0f-4caa-b9af-353b2855e997\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d686c5c7-8g4hv" Apr 19 15:24:34.919104 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.919079 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d686c5c7-8g4hv" Apr 19 15:24:34.933152 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.933129 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:24:34.968219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.968182 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7577782b-d81a-428d-abba-4b2f85606b5a-config-volume\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:34.968383 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.968230 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:34.968383 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.968257 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7577782b-d81a-428d-abba-4b2f85606b5a-tmp-dir\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:34.968383 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.968295 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fvn6\" (UniqueName: \"kubernetes.io/projected/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-kube-api-access-4fvn6\") pod \"ingress-canary-lnp5x\" (UID: \"2f59fe9a-35b7-4dae-ae52-22cef5a86c77\") " pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:24:34.968383 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.968322 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mwbb\" (UniqueName: \"kubernetes.io/projected/7577782b-d81a-428d-abba-4b2f85606b5a-kube-api-access-2mwbb\") pod \"dns-default-frphc\" (UID: 
\"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:34.968383 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.968368 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert\") pod \"ingress-canary-lnp5x\" (UID: \"2f59fe9a-35b7-4dae-ae52-22cef5a86c77\") " pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:24:34.968656 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:34.968561 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 15:24:34.968656 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:34.968620 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert podName:2f59fe9a-35b7-4dae-ae52-22cef5a86c77 nodeName:}" failed. No retries permitted until 2026-04-19 15:24:35.468602162 +0000 UTC m=+34.268517703 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert") pod "ingress-canary-lnp5x" (UID: "2f59fe9a-35b7-4dae-ae52-22cef5a86c77") : secret "canary-serving-cert" not found Apr 19 15:24:34.969316 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:34.969009 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 15:24:34.969316 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.969077 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7577782b-d81a-428d-abba-4b2f85606b5a-tmp-dir\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:34.969316 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:34.969102 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls podName:7577782b-d81a-428d-abba-4b2f85606b5a nodeName:}" failed. No retries permitted until 2026-04-19 15:24:35.469082708 +0000 UTC m=+34.268998263 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls") pod "dns-default-frphc" (UID: "7577782b-d81a-428d-abba-4b2f85606b5a") : secret "dns-default-metrics-tls" not found Apr 19 15:24:34.969487 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.969462 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7577782b-d81a-428d-abba-4b2f85606b5a-config-volume\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:34.972434 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.972254 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" Apr 19 15:24:34.978941 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.978897 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mwbb\" (UniqueName: \"kubernetes.io/projected/7577782b-d81a-428d-abba-4b2f85606b5a-kube-api-access-2mwbb\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:34.978941 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:34.978925 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fvn6\" (UniqueName: \"kubernetes.io/projected/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-kube-api-access-4fvn6\") pod \"ingress-canary-lnp5x\" (UID: \"2f59fe9a-35b7-4dae-ae52-22cef5a86c77\") " pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:24:35.125215 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.124730 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6"] Apr 19 15:24:35.127537 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.127507 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d686c5c7-8g4hv"] Apr 19 15:24:35.128761 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:35.128709 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd771e17e_8f69_4e6b_b939_24cade594a96.slice/crio-89f46e5b4a0935c6c8d0372a2e2c13bb0ac2daa3faf717667044521c14cb7980 WatchSource:0}: Error finding container 89f46e5b4a0935c6c8d0372a2e2c13bb0ac2daa3faf717667044521c14cb7980: Status 404 returned error can't find the container with id 89f46e5b4a0935c6c8d0372a2e2c13bb0ac2daa3faf717667044521c14cb7980 Apr 19 15:24:35.131977 ip-10-0-131-48 kubenswrapper[2564]: W0419 
15:24:35.131947 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8d87387_5e0f_4caa_b9af_353b2855e997.slice/crio-ee0d0c101177cd45daa26663b2a802d57ffff30d58aea32faac6a06b27015276 WatchSource:0}: Error finding container ee0d0c101177cd45daa26663b2a802d57ffff30d58aea32faac6a06b27015276: Status 404 returned error can't find the container with id ee0d0c101177cd45daa26663b2a802d57ffff30d58aea32faac6a06b27015276 Apr 19 15:24:35.141544 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.141519 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr"] Apr 19 15:24:35.144600 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:35.144576 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabdd09d7_7736_4592_ac16_cc3af5cd3b0c.slice/crio-e827d81530026f3131038302a04bed08d2cc8a061b7847b828420e40fd03bc68 WatchSource:0}: Error finding container e827d81530026f3131038302a04bed08d2cc8a061b7847b828420e40fd03bc68: Status 404 returned error can't find the container with id e827d81530026f3131038302a04bed08d2cc8a061b7847b828420e40fd03bc68 Apr 19 15:24:35.375692 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.375598 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs\") pod \"network-metrics-daemon-6gs28\" (UID: \"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9\") " pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:24:35.375844 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.375693 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: 
\"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:35.375844 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:35.375765 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 15:24:35.375844 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:35.375785 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c7fffcdf6-4s9gn: secret "image-registry-tls" not found Apr 19 15:24:35.375963 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:35.375848 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls podName:53349c81-aa87-40ff-92e5-714108c8cefc nodeName:}" failed. No retries permitted until 2026-04-19 15:24:36.375828556 +0000 UTC m=+35.175744097 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls") pod "image-registry-6c7fffcdf6-4s9gn" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc") : secret "image-registry-tls" not found Apr 19 15:24:35.375963 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:35.375762 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 19 15:24:35.375963 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:35.375927 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs podName:4bc5acd5-37f1-4ca4-bb28-39e2f95194f9 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:07.375909265 +0000 UTC m=+66.175824826 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs") pod "network-metrics-daemon-6gs28" (UID: "4bc5acd5-37f1-4ca4-bb28-39e2f95194f9") : secret "metrics-daemon-secret" not found Apr 19 15:24:35.477075 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.477033 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert\") pod \"ingress-canary-lnp5x\" (UID: \"2f59fe9a-35b7-4dae-ae52-22cef5a86c77\") " pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:24:35.477252 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.477150 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:35.477252 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:35.477196 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 15:24:35.477252 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:35.477247 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 15:24:35.477382 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:35.477259 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert podName:2f59fe9a-35b7-4dae-ae52-22cef5a86c77 nodeName:}" failed. No retries permitted until 2026-04-19 15:24:36.477240616 +0000 UTC m=+35.277156166 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert") pod "ingress-canary-lnp5x" (UID: "2f59fe9a-35b7-4dae-ae52-22cef5a86c77") : secret "canary-serving-cert" not found Apr 19 15:24:35.477382 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:35.477281 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls podName:7577782b-d81a-428d-abba-4b2f85606b5a nodeName:}" failed. No retries permitted until 2026-04-19 15:24:36.477271618 +0000 UTC m=+35.277187155 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls") pod "dns-default-frphc" (UID: "7577782b-d81a-428d-abba-4b2f85606b5a") : secret "dns-default-metrics-tls" not found Apr 19 15:24:35.577531 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.577492 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cjtd\" (UniqueName: \"kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd\") pod \"network-check-target-g9dx5\" (UID: \"07623157-4c19-4793-880d-b21867ce44f7\") " pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:35.577727 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:35.577682 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 15:24:35.577727 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:35.577703 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 15:24:35.577727 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:35.577713 2564 projected.go:194] Error preparing data for projected volume 
kube-api-access-2cjtd for pod openshift-network-diagnostics/network-check-target-g9dx5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:24:35.577892 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:35.577778 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd podName:07623157-4c19-4793-880d-b21867ce44f7 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:07.577755939 +0000 UTC m=+66.377671478 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-2cjtd" (UniqueName: "kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd") pod "network-check-target-g9dx5" (UID: "07623157-4c19-4793-880d-b21867ce44f7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:24:35.662759 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.662677 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:24:35.662922 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.662867 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:35.665314 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.665289 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 19 15:24:35.665796 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.665332 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-b2zbg\"" Apr 19 15:24:35.665796 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.665444 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 19 15:24:35.665796 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.665543 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 19 15:24:35.944621 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.944542 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" event={"ID":"abdd09d7-7736-4592-ac16-cc3af5cd3b0c","Type":"ContainerStarted","Data":"e827d81530026f3131038302a04bed08d2cc8a061b7847b828420e40fd03bc68"} Apr 19 15:24:35.946377 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.946348 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" event={"ID":"d771e17e-8f69-4e6b-b939-24cade594a96","Type":"ContainerStarted","Data":"89f46e5b4a0935c6c8d0372a2e2c13bb0ac2daa3faf717667044521c14cb7980"} Apr 19 15:24:35.948262 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:35.948226 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d686c5c7-8g4hv" 
event={"ID":"f8d87387-5e0f-4caa-b9af-353b2855e997","Type":"ContainerStarted","Data":"ee0d0c101177cd45daa26663b2a802d57ffff30d58aea32faac6a06b27015276"} Apr 19 15:24:36.384981 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:36.384257 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:36.384981 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:36.384540 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 15:24:36.384981 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:36.384559 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c7fffcdf6-4s9gn: secret "image-registry-tls" not found Apr 19 15:24:36.384981 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:36.384623 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls podName:53349c81-aa87-40ff-92e5-714108c8cefc nodeName:}" failed. No retries permitted until 2026-04-19 15:24:38.384602644 +0000 UTC m=+37.184518199 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls") pod "image-registry-6c7fffcdf6-4s9gn" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc") : secret "image-registry-tls" not found Apr 19 15:24:36.485974 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:36.485205 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert\") pod \"ingress-canary-lnp5x\" (UID: \"2f59fe9a-35b7-4dae-ae52-22cef5a86c77\") " pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:24:36.485974 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:36.485332 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:36.485974 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:36.485461 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 15:24:36.485974 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:36.485520 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls podName:7577782b-d81a-428d-abba-4b2f85606b5a nodeName:}" failed. No retries permitted until 2026-04-19 15:24:38.485502785 +0000 UTC m=+37.285418327 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls") pod "dns-default-frphc" (UID: "7577782b-d81a-428d-abba-4b2f85606b5a") : secret "dns-default-metrics-tls" not found Apr 19 15:24:36.485974 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:36.485583 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 15:24:36.485974 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:36.485615 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert podName:2f59fe9a-35b7-4dae-ae52-22cef5a86c77 nodeName:}" failed. No retries permitted until 2026-04-19 15:24:38.48560384 +0000 UTC m=+37.285519394 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert") pod "ingress-canary-lnp5x" (UID: "2f59fe9a-35b7-4dae-ae52-22cef5a86c77") : secret "canary-serving-cert" not found Apr 19 15:24:38.406774 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:38.406727 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:38.407217 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:38.406909 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 15:24:38.407217 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:38.406931 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c7fffcdf6-4s9gn: secret "image-registry-tls" not found Apr 19 
15:24:38.407217 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:38.407004 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls podName:53349c81-aa87-40ff-92e5-714108c8cefc nodeName:}" failed. No retries permitted until 2026-04-19 15:24:42.406982688 +0000 UTC m=+41.206898226 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls") pod "image-registry-6c7fffcdf6-4s9gn" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc") : secret "image-registry-tls" not found Apr 19 15:24:38.507288 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:38.507250 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:38.507459 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:38.507333 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert\") pod \"ingress-canary-lnp5x\" (UID: \"2f59fe9a-35b7-4dae-ae52-22cef5a86c77\") " pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:24:38.507459 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:38.507443 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 15:24:38.507459 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:38.507450 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 15:24:38.507617 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:38.507504 2564 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert podName:2f59fe9a-35b7-4dae-ae52-22cef5a86c77 nodeName:}" failed. No retries permitted until 2026-04-19 15:24:42.507486796 +0000 UTC m=+41.307402352 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert") pod "ingress-canary-lnp5x" (UID: "2f59fe9a-35b7-4dae-ae52-22cef5a86c77") : secret "canary-serving-cert" not found Apr 19 15:24:38.507617 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:38.507520 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls podName:7577782b-d81a-428d-abba-4b2f85606b5a nodeName:}" failed. No retries permitted until 2026-04-19 15:24:42.507511177 +0000 UTC m=+41.307426714 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls") pod "dns-default-frphc" (UID: "7577782b-d81a-428d-abba-4b2f85606b5a") : secret "dns-default-metrics-tls" not found Apr 19 15:24:42.443370 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:42.443326 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:42.443872 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:42.443487 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 15:24:42.443872 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:42.443511 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c7fffcdf6-4s9gn: 
secret "image-registry-tls" not found Apr 19 15:24:42.443872 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:42.443597 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls podName:53349c81-aa87-40ff-92e5-714108c8cefc nodeName:}" failed. No retries permitted until 2026-04-19 15:24:50.443574277 +0000 UTC m=+49.243489860 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls") pod "image-registry-6c7fffcdf6-4s9gn" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc") : secret "image-registry-tls" not found Apr 19 15:24:42.543980 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:42.543935 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert\") pod \"ingress-canary-lnp5x\" (UID: \"2f59fe9a-35b7-4dae-ae52-22cef5a86c77\") " pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:24:42.544171 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:42.544053 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 15:24:42.544171 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:42.544067 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:42.544171 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:42.544132 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert podName:2f59fe9a-35b7-4dae-ae52-22cef5a86c77 nodeName:}" failed. 
No retries permitted until 2026-04-19 15:24:50.544102817 +0000 UTC m=+49.344018377 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert") pod "ingress-canary-lnp5x" (UID: "2f59fe9a-35b7-4dae-ae52-22cef5a86c77") : secret "canary-serving-cert" not found Apr 19 15:24:42.544359 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:42.544201 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 15:24:42.544359 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:42.544256 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls podName:7577782b-d81a-428d-abba-4b2f85606b5a nodeName:}" failed. No retries permitted until 2026-04-19 15:24:50.54424141 +0000 UTC m=+49.344156948 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls") pod "dns-default-frphc" (UID: "7577782b-d81a-428d-abba-4b2f85606b5a") : secret "dns-default-metrics-tls" not found Apr 19 15:24:42.745856 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:42.745775 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret\") pod \"global-pull-secret-syncer-p2kvw\" (UID: \"16c2a1f0-b512-4202-a476-b96d67bc2fcf\") " pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:42.749458 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:42.749434 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16c2a1f0-b512-4202-a476-b96d67bc2fcf-original-pull-secret\") pod \"global-pull-secret-syncer-p2kvw\" (UID: \"16c2a1f0-b512-4202-a476-b96d67bc2fcf\") " 
pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:42.879073 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:42.879037 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p2kvw" Apr 19 15:24:44.306618 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:44.306567 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-p2kvw"] Apr 19 15:24:44.310753 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:24:44.310723 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16c2a1f0_b512_4202_a476_b96d67bc2fcf.slice/crio-7153546b44f715d651f0c29809400868e23c5537b972c8b63f2b76796ed1933d WatchSource:0}: Error finding container 7153546b44f715d651f0c29809400868e23c5537b972c8b63f2b76796ed1933d: Status 404 returned error can't find the container with id 7153546b44f715d651f0c29809400868e23c5537b972c8b63f2b76796ed1933d Apr 19 15:24:44.967523 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:44.967436 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" event={"ID":"abdd09d7-7736-4592-ac16-cc3af5cd3b0c","Type":"ContainerStarted","Data":"31148103684f3816a64ba327f0dade82d397feca9c422ca4698a86c398c9ae55"} Apr 19 15:24:44.967830 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:44.967644 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" Apr 19 15:24:44.969308 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:44.969269 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" event={"ID":"d771e17e-8f69-4e6b-b939-24cade594a96","Type":"ContainerStarted","Data":"17c66069b30423223c86de03dde9f3b92fc54599d4b82f07b290db4db7b5846d"} Apr 19 
15:24:44.969686 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:44.969661 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" Apr 19 15:24:44.972011 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:44.971982 2564 generic.go:358] "Generic (PLEG): container finished" podID="f4487f66-c637-425c-a304-b53a5a1d6b25" containerID="15b82c47160ee82a2aa6b8df5e894fbf258a9428bbfb4d66a5df5631eff54164" exitCode=0 Apr 19 15:24:44.972134 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:44.972060 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57hmj" event={"ID":"f4487f66-c637-425c-a304-b53a5a1d6b25","Type":"ContainerDied","Data":"15b82c47160ee82a2aa6b8df5e894fbf258a9428bbfb4d66a5df5631eff54164"} Apr 19 15:24:44.973355 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:44.973330 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d686c5c7-8g4hv" event={"ID":"f8d87387-5e0f-4caa-b9af-353b2855e997","Type":"ContainerStarted","Data":"b03a6bf2af605c130f145094ca86696a2da43f6a1c86e3a3400c49b62421e9b6"} Apr 19 15:24:44.974468 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:44.974448 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-p2kvw" event={"ID":"16c2a1f0-b512-4202-a476-b96d67bc2fcf","Type":"ContainerStarted","Data":"7153546b44f715d651f0c29809400868e23c5537b972c8b63f2b76796ed1933d"} Apr 19 15:24:44.983350 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:44.983292 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cc95cddf6-5g7rr" podStartSLOduration=26.953314348 podStartE2EDuration="35.983280118s" podCreationTimestamp="2026-04-19 15:24:09 +0000 UTC" firstStartedPulling="2026-04-19 15:24:35.14624828 +0000 UTC 
m=+33.946163817" lastFinishedPulling="2026-04-19 15:24:44.176214036 +0000 UTC m=+42.976129587" observedRunningTime="2026-04-19 15:24:44.983085375 +0000 UTC m=+43.783000935" watchObservedRunningTime="2026-04-19 15:24:44.983280118 +0000 UTC m=+43.783195677" Apr 19 15:24:45.017119 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:45.017077 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d686c5c7-8g4hv" podStartSLOduration=26.975003967 podStartE2EDuration="36.01706357s" podCreationTimestamp="2026-04-19 15:24:09 +0000 UTC" firstStartedPulling="2026-04-19 15:24:35.134153004 +0000 UTC m=+33.934068548" lastFinishedPulling="2026-04-19 15:24:44.176212604 +0000 UTC m=+42.976128151" observedRunningTime="2026-04-19 15:24:45.016844269 +0000 UTC m=+43.816759829" watchObservedRunningTime="2026-04-19 15:24:45.01706357 +0000 UTC m=+43.816979123" Apr 19 15:24:45.979625 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:45.979582 2564 generic.go:358] "Generic (PLEG): container finished" podID="f4487f66-c637-425c-a304-b53a5a1d6b25" containerID="18728c190ee99b932d00b7ca5e4c8ce97a7dc8ea48f66ac45d85d40b585a8142" exitCode=0 Apr 19 15:24:45.980472 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:45.980440 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57hmj" event={"ID":"f4487f66-c637-425c-a304-b53a5a1d6b25","Type":"ContainerDied","Data":"18728c190ee99b932d00b7ca5e4c8ce97a7dc8ea48f66ac45d85d40b585a8142"} Apr 19 15:24:46.985240 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:46.985190 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57hmj" event={"ID":"f4487f66-c637-425c-a304-b53a5a1d6b25","Type":"ContainerStarted","Data":"f7d82a6d2c4f0107bebea1025f67165496eaea22051de4392bfe6ccf45929e91"} Apr 19 15:24:47.018712 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:47.018650 2564 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-57hmj" podStartSLOduration=6.5168940939999995 podStartE2EDuration="46.018613755s" podCreationTimestamp="2026-04-19 15:24:01 +0000 UTC" firstStartedPulling="2026-04-19 15:24:04.696746688 +0000 UTC m=+3.496662227" lastFinishedPulling="2026-04-19 15:24:44.198466344 +0000 UTC m=+42.998381888" observedRunningTime="2026-04-19 15:24:47.018443737 +0000 UTC m=+45.818359297" watchObservedRunningTime="2026-04-19 15:24:47.018613755 +0000 UTC m=+45.818529317" Apr 19 15:24:48.991674 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:48.991617 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-p2kvw" event={"ID":"16c2a1f0-b512-4202-a476-b96d67bc2fcf","Type":"ContainerStarted","Data":"8e2698fdb9224b333fae0c36879901da9da09bea711ff05c3e1691f03d906bf3"} Apr 19 15:24:48.993521 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:48.993485 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" event={"ID":"d771e17e-8f69-4e6b-b939-24cade594a96","Type":"ContainerStarted","Data":"e4ca02c2f5744fd0bd98ebd1f2edca142a15d7575c7247119a79dde948e65d6b"} Apr 19 15:24:48.993521 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:48.993520 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" event={"ID":"d771e17e-8f69-4e6b-b939-24cade594a96","Type":"ContainerStarted","Data":"8f133420b24b26a014c697f841283940f50c2263145285cf41eab289140377ae"} Apr 19 15:24:49.004851 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:49.004811 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-p2kvw" podStartSLOduration=18.894689259 podStartE2EDuration="23.004798014s" podCreationTimestamp="2026-04-19 15:24:26 +0000 UTC" 
firstStartedPulling="2026-04-19 15:24:44.312666171 +0000 UTC m=+43.112581708" lastFinishedPulling="2026-04-19 15:24:48.422774912 +0000 UTC m=+47.222690463" observedRunningTime="2026-04-19 15:24:49.004585394 +0000 UTC m=+47.804500954" watchObservedRunningTime="2026-04-19 15:24:49.004798014 +0000 UTC m=+47.804713575" Apr 19 15:24:49.023759 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:49.023720 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" podStartSLOduration=26.742295253 podStartE2EDuration="40.023709812s" podCreationTimestamp="2026-04-19 15:24:09 +0000 UTC" firstStartedPulling="2026-04-19 15:24:35.131318458 +0000 UTC m=+33.931233995" lastFinishedPulling="2026-04-19 15:24:48.412733002 +0000 UTC m=+47.212648554" observedRunningTime="2026-04-19 15:24:49.022279344 +0000 UTC m=+47.822194938" watchObservedRunningTime="2026-04-19 15:24:49.023709812 +0000 UTC m=+47.823625362" Apr 19 15:24:50.512913 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:50.512874 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:24:50.513278 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:50.513013 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 15:24:50.513278 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:50.513028 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c7fffcdf6-4s9gn: secret "image-registry-tls" not found Apr 19 15:24:50.513278 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:50.513084 2564 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls podName:53349c81-aa87-40ff-92e5-714108c8cefc nodeName:}" failed. No retries permitted until 2026-04-19 15:25:06.513066692 +0000 UTC m=+65.312982232 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls") pod "image-registry-6c7fffcdf6-4s9gn" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc") : secret "image-registry-tls" not found Apr 19 15:24:50.613472 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:50.613438 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:24:50.613624 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:50.613491 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert\") pod \"ingress-canary-lnp5x\" (UID: \"2f59fe9a-35b7-4dae-ae52-22cef5a86c77\") " pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:24:50.613624 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:50.613573 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 15:24:50.613624 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:50.613590 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 15:24:50.613624 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:50.613622 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert 
podName:2f59fe9a-35b7-4dae-ae52-22cef5a86c77 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:06.613609439 +0000 UTC m=+65.413524976 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert") pod "ingress-canary-lnp5x" (UID: "2f59fe9a-35b7-4dae-ae52-22cef5a86c77") : secret "canary-serving-cert" not found Apr 19 15:24:50.613846 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:24:50.613667 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls podName:7577782b-d81a-428d-abba-4b2f85606b5a nodeName:}" failed. No retries permitted until 2026-04-19 15:25:06.613650041 +0000 UTC m=+65.413565588 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls") pod "dns-default-frphc" (UID: "7577782b-d81a-428d-abba-4b2f85606b5a") : secret "dns-default-metrics-tls" not found Apr 19 15:24:58.937762 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:24:58.937734 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwk8b" Apr 19 15:25:06.530909 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:06.530858 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:25:06.531443 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:06.530993 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 15:25:06.531443 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:06.531012 2564 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c7fffcdf6-4s9gn: secret "image-registry-tls" not found Apr 19 15:25:06.531443 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:06.531086 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls podName:53349c81-aa87-40ff-92e5-714108c8cefc nodeName:}" failed. No retries permitted until 2026-04-19 15:25:38.531065789 +0000 UTC m=+97.330981328 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls") pod "image-registry-6c7fffcdf6-4s9gn" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc") : secret "image-registry-tls" not found Apr 19 15:25:06.631688 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:06.631650 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:25:06.631849 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:06.631705 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert\") pod \"ingress-canary-lnp5x\" (UID: \"2f59fe9a-35b7-4dae-ae52-22cef5a86c77\") " pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:25:06.631849 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:06.631793 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 15:25:06.631955 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:06.631855 2564 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls podName:7577782b-d81a-428d-abba-4b2f85606b5a nodeName:}" failed. No retries permitted until 2026-04-19 15:25:38.631841681 +0000 UTC m=+97.431757224 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls") pod "dns-default-frphc" (UID: "7577782b-d81a-428d-abba-4b2f85606b5a") : secret "dns-default-metrics-tls" not found Apr 19 15:25:06.631955 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:06.631797 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 15:25:06.631955 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:06.631945 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert podName:2f59fe9a-35b7-4dae-ae52-22cef5a86c77 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:38.631927748 +0000 UTC m=+97.431843302 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert") pod "ingress-canary-lnp5x" (UID: "2f59fe9a-35b7-4dae-ae52-22cef5a86c77") : secret "canary-serving-cert" not found Apr 19 15:25:07.438254 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:07.438205 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs\") pod \"network-metrics-daemon-6gs28\" (UID: \"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9\") " pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:25:07.438435 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:07.438356 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 19 15:25:07.438435 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:07.438419 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs podName:4bc5acd5-37f1-4ca4-bb28-39e2f95194f9 nodeName:}" failed. No retries permitted until 2026-04-19 15:26:11.438405567 +0000 UTC m=+130.238321103 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs") pod "network-metrics-daemon-6gs28" (UID: "4bc5acd5-37f1-4ca4-bb28-39e2f95194f9") : secret "metrics-daemon-secret" not found Apr 19 15:25:07.640392 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:07.640358 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cjtd\" (UniqueName: \"kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd\") pod \"network-check-target-g9dx5\" (UID: \"07623157-4c19-4793-880d-b21867ce44f7\") " pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:25:07.643488 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:07.643464 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 19 15:25:07.653719 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:07.653695 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 19 15:25:07.664264 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:07.664244 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cjtd\" (UniqueName: \"kubernetes.io/projected/07623157-4c19-4793-880d-b21867ce44f7-kube-api-access-2cjtd\") pod \"network-check-target-g9dx5\" (UID: \"07623157-4c19-4793-880d-b21867ce44f7\") " pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:25:07.776780 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:07.776752 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-b2zbg\"" Apr 19 15:25:07.784923 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:07.784895 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:25:07.895168 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:07.895139 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-g9dx5"] Apr 19 15:25:07.898989 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:25:07.898965 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07623157_4c19_4793_880d_b21867ce44f7.slice/crio-43778549e0bc6b79cb931b9a30d08886aa5e8dc815d822b37298e641080ca5e0 WatchSource:0}: Error finding container 43778549e0bc6b79cb931b9a30d08886aa5e8dc815d822b37298e641080ca5e0: Status 404 returned error can't find the container with id 43778549e0bc6b79cb931b9a30d08886aa5e8dc815d822b37298e641080ca5e0 Apr 19 15:25:08.038472 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:08.038386 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-g9dx5" event={"ID":"07623157-4c19-4793-880d-b21867ce44f7","Type":"ContainerStarted","Data":"43778549e0bc6b79cb931b9a30d08886aa5e8dc815d822b37298e641080ca5e0"} Apr 19 15:25:12.049772 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:12.049738 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-g9dx5" event={"ID":"07623157-4c19-4793-880d-b21867ce44f7","Type":"ContainerStarted","Data":"feb7fe202708565aa6344d704234015ac8aeed41ea957b2d3977656ba8208a41"} Apr 19 15:25:12.050124 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:12.049868 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:25:12.064221 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:12.064176 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-g9dx5" 
podStartSLOduration=67.869958342 podStartE2EDuration="1m11.064166067s" podCreationTimestamp="2026-04-19 15:24:01 +0000 UTC" firstStartedPulling="2026-04-19 15:25:07.900853308 +0000 UTC m=+66.700768845" lastFinishedPulling="2026-04-19 15:25:11.095061018 +0000 UTC m=+69.894976570" observedRunningTime="2026-04-19 15:25:12.06378319 +0000 UTC m=+70.863698759" watchObservedRunningTime="2026-04-19 15:25:12.064166067 +0000 UTC m=+70.864081626" Apr 19 15:25:38.572575 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:38.572519 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:25:38.573025 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:38.572691 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 15:25:38.573025 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:38.572712 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c7fffcdf6-4s9gn: secret "image-registry-tls" not found Apr 19 15:25:38.573025 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:38.572780 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls podName:53349c81-aa87-40ff-92e5-714108c8cefc nodeName:}" failed. No retries permitted until 2026-04-19 15:26:42.572763266 +0000 UTC m=+161.372678803 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls") pod "image-registry-6c7fffcdf6-4s9gn" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc") : secret "image-registry-tls" not found Apr 19 15:25:38.673524 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:38.673473 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:25:38.673684 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:38.673545 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert\") pod \"ingress-canary-lnp5x\" (UID: \"2f59fe9a-35b7-4dae-ae52-22cef5a86c77\") " pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:25:38.673684 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:38.673617 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 15:25:38.673684 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:38.673664 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 15:25:38.673782 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:38.673710 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert podName:2f59fe9a-35b7-4dae-ae52-22cef5a86c77 nodeName:}" failed. No retries permitted until 2026-04-19 15:26:42.673698118 +0000 UTC m=+161.473613654 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert") pod "ingress-canary-lnp5x" (UID: "2f59fe9a-35b7-4dae-ae52-22cef5a86c77") : secret "canary-serving-cert" not found Apr 19 15:25:38.673782 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:25:38.673722 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls podName:7577782b-d81a-428d-abba-4b2f85606b5a nodeName:}" failed. No retries permitted until 2026-04-19 15:26:42.673716821 +0000 UTC m=+161.473632357 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls") pod "dns-default-frphc" (UID: "7577782b-d81a-428d-abba-4b2f85606b5a") : secret "dns-default-metrics-tls" not found Apr 19 15:25:43.054908 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:43.054873 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-g9dx5" Apr 19 15:25:55.837668 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:55.837594 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-f7z6w"] Apr 19 15:25:55.840512 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:55.840495 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7z6w" Apr 19 15:25:55.843107 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:55.843086 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-bkz9s\"" Apr 19 15:25:55.846286 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:55.846265 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-f7z6w"] Apr 19 15:25:55.885353 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:55.885323 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z45qc\" (UniqueName: \"kubernetes.io/projected/f3911b93-0872-4188-b092-01a8b4b8326d-kube-api-access-z45qc\") pod \"network-check-source-8894fc9bd-f7z6w\" (UID: \"f3911b93-0872-4188-b092-01a8b4b8326d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7z6w" Apr 19 15:25:55.986683 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:55.986654 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z45qc\" (UniqueName: \"kubernetes.io/projected/f3911b93-0872-4188-b092-01a8b4b8326d-kube-api-access-z45qc\") pod \"network-check-source-8894fc9bd-f7z6w\" (UID: \"f3911b93-0872-4188-b092-01a8b4b8326d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7z6w" Apr 19 15:25:55.994869 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:55.994844 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z45qc\" (UniqueName: \"kubernetes.io/projected/f3911b93-0872-4188-b092-01a8b4b8326d-kube-api-access-z45qc\") pod \"network-check-source-8894fc9bd-f7z6w\" (UID: \"f3911b93-0872-4188-b092-01a8b4b8326d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7z6w" Apr 19 15:25:56.150089 ip-10-0-131-48 kubenswrapper[2564]: I0419 
15:25:56.150011 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7z6w" Apr 19 15:25:56.261685 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:56.261655 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-f7z6w"] Apr 19 15:25:56.264477 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:25:56.264449 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3911b93_0872_4188_b092_01a8b4b8326d.slice/crio-f2e423527dc98337b18a7f3ca77ee67234583b4e00c4c504cbecea6e4207b3bb WatchSource:0}: Error finding container f2e423527dc98337b18a7f3ca77ee67234583b4e00c4c504cbecea6e4207b3bb: Status 404 returned error can't find the container with id f2e423527dc98337b18a7f3ca77ee67234583b4e00c4c504cbecea6e4207b3bb Apr 19 15:25:57.152368 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:57.152326 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7z6w" event={"ID":"f3911b93-0872-4188-b092-01a8b4b8326d","Type":"ContainerStarted","Data":"34cc1bb62a561d79d7c1865e8fb1f648b54789ba1939f4af4f60a7758e32283c"} Apr 19 15:25:57.152368 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:57.152369 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7z6w" event={"ID":"f3911b93-0872-4188-b092-01a8b4b8326d","Type":"ContainerStarted","Data":"f2e423527dc98337b18a7f3ca77ee67234583b4e00c4c504cbecea6e4207b3bb"} Apr 19 15:25:57.167030 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:57.166992 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7z6w" podStartSLOduration=2.166976946 podStartE2EDuration="2.166976946s" podCreationTimestamp="2026-04-19 15:25:55 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:25:57.166606009 +0000 UTC m=+115.966521582" watchObservedRunningTime="2026-04-19 15:25:57.166976946 +0000 UTC m=+115.966892504" Apr 19 15:25:59.754975 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:25:59.754943 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9wldk_e251a3a4-2359-4dcf-90f8-20b9a43b9aa4/dns-node-resolver/0.log" Apr 19 15:26:00.955518 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:00.955487 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d6jcn_e1bc6add-6e54-4e69-ae6d-573e33e1dc7a/node-ca/0.log" Apr 19 15:26:11.509555 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:11.509518 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs\") pod \"network-metrics-daemon-6gs28\" (UID: \"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9\") " pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:26:11.511726 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:11.511696 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bc5acd5-37f1-4ca4-bb28-39e2f95194f9-metrics-certs\") pod \"network-metrics-daemon-6gs28\" (UID: \"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9\") " pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:26:11.607723 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:11.607689 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zjn96\"" Apr 19 15:26:11.615926 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:11.615903 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6gs28" Apr 19 15:26:11.731180 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:11.731149 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6gs28"] Apr 19 15:26:11.733810 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:26:11.733778 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bc5acd5_37f1_4ca4_bb28_39e2f95194f9.slice/crio-7e66d122dcda866ae3502d1feb2afc598ed96e0113aed0af1a78be4c636cb18c WatchSource:0}: Error finding container 7e66d122dcda866ae3502d1feb2afc598ed96e0113aed0af1a78be4c636cb18c: Status 404 returned error can't find the container with id 7e66d122dcda866ae3502d1feb2afc598ed96e0113aed0af1a78be4c636cb18c Apr 19 15:26:12.190201 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:12.190157 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6gs28" event={"ID":"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9","Type":"ContainerStarted","Data":"7e66d122dcda866ae3502d1feb2afc598ed96e0113aed0af1a78be4c636cb18c"} Apr 19 15:26:13.194120 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:13.194041 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6gs28" event={"ID":"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9","Type":"ContainerStarted","Data":"634b91ce0860e6bf2c40f14aafb3ebb0750af212f8d8e6109f7fa8eb0dea0371"} Apr 19 15:26:13.194120 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:13.194075 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6gs28" event={"ID":"4bc5acd5-37f1-4ca4-bb28-39e2f95194f9","Type":"ContainerStarted","Data":"fc259fbe251cfd2b3b2c9242ce3e7281c799175fcb32af4ca72692ed1a3ef9ce"} Apr 19 15:26:13.211273 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:13.211233 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-6gs28" podStartSLOduration=131.002416395 podStartE2EDuration="2m12.211219299s" podCreationTimestamp="2026-04-19 15:24:01 +0000 UTC" firstStartedPulling="2026-04-19 15:26:11.735611584 +0000 UTC m=+130.535527121" lastFinishedPulling="2026-04-19 15:26:12.944414487 +0000 UTC m=+131.744330025" observedRunningTime="2026-04-19 15:26:13.209886515 +0000 UTC m=+132.009802075" watchObservedRunningTime="2026-04-19 15:26:13.211219299 +0000 UTC m=+132.011134858" Apr 19 15:26:22.448699 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.448667 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9b56l"] Apr 19 15:26:22.451746 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.451728 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.454380 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.454357 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jvwp2\"" Apr 19 15:26:22.454511 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.454466 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 19 15:26:22.456596 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.456567 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 19 15:26:22.456733 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.456567 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 19 15:26:22.456733 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.456646 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 19 
15:26:22.462531 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.462512 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9b56l"] Apr 19 15:26:22.590785 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.590752 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/02e21f7d-06af-4ca8-b988-b015f529dbf7-crio-socket\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.590785 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.590785 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/02e21f7d-06af-4ca8-b988-b015f529dbf7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.591028 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.590816 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8nh7\" (UniqueName: \"kubernetes.io/projected/02e21f7d-06af-4ca8-b988-b015f529dbf7-kube-api-access-m8nh7\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.591028 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.590934 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/02e21f7d-06af-4ca8-b988-b015f529dbf7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " 
pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.591028 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.590973 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/02e21f7d-06af-4ca8-b988-b015f529dbf7-data-volume\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.691849 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.691813 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/02e21f7d-06af-4ca8-b988-b015f529dbf7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.691987 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.691866 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8nh7\" (UniqueName: \"kubernetes.io/projected/02e21f7d-06af-4ca8-b988-b015f529dbf7-kube-api-access-m8nh7\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.691987 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.691908 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/02e21f7d-06af-4ca8-b988-b015f529dbf7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.691987 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.691939 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" 
(UniqueName: \"kubernetes.io/empty-dir/02e21f7d-06af-4ca8-b988-b015f529dbf7-data-volume\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.692081 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.692035 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/02e21f7d-06af-4ca8-b988-b015f529dbf7-crio-socket\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.692135 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.692122 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/02e21f7d-06af-4ca8-b988-b015f529dbf7-crio-socket\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.692300 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.692269 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/02e21f7d-06af-4ca8-b988-b015f529dbf7-data-volume\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.692455 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.692441 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/02e21f7d-06af-4ca8-b988-b015f529dbf7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.694335 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.694315 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/02e21f7d-06af-4ca8-b988-b015f529dbf7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.701444 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.701376 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8nh7\" (UniqueName: \"kubernetes.io/projected/02e21f7d-06af-4ca8-b988-b015f529dbf7-kube-api-access-m8nh7\") pod \"insights-runtime-extractor-9b56l\" (UID: \"02e21f7d-06af-4ca8-b988-b015f529dbf7\") " pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.760992 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.760962 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9b56l" Apr 19 15:26:22.873883 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:22.873836 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9b56l"] Apr 19 15:26:22.878182 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:26:22.878143 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02e21f7d_06af_4ca8_b988_b015f529dbf7.slice/crio-6ddedbef427d030e4b95ee10efce0ce24bf8cfa8ed98b0818fa10ef3567d2093 WatchSource:0}: Error finding container 6ddedbef427d030e4b95ee10efce0ce24bf8cfa8ed98b0818fa10ef3567d2093: Status 404 returned error can't find the container with id 6ddedbef427d030e4b95ee10efce0ce24bf8cfa8ed98b0818fa10ef3567d2093 Apr 19 15:26:23.222128 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:23.222091 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9b56l" 
event={"ID":"02e21f7d-06af-4ca8-b988-b015f529dbf7","Type":"ContainerStarted","Data":"2df72aa376cf127fdb18ca3596271aaa7bd4f9f270000ba60419c83f725b5856"} Apr 19 15:26:23.222128 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:23.222129 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9b56l" event={"ID":"02e21f7d-06af-4ca8-b988-b015f529dbf7","Type":"ContainerStarted","Data":"6ddedbef427d030e4b95ee10efce0ce24bf8cfa8ed98b0818fa10ef3567d2093"} Apr 19 15:26:24.226219 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:24.226184 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9b56l" event={"ID":"02e21f7d-06af-4ca8-b988-b015f529dbf7","Type":"ContainerStarted","Data":"6d2795d4e355fb5c20bd688f710d6e2b631046d6b87df028272074e9aac8c54b"} Apr 19 15:26:26.234753 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:26.234714 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9b56l" event={"ID":"02e21f7d-06af-4ca8-b988-b015f529dbf7","Type":"ContainerStarted","Data":"3ffc4cc28ed0499b4e9cc8467e35837e15277aa377eaf4b2f5d8fb705057320c"} Apr 19 15:26:26.250392 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:26.250346 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9b56l" podStartSLOduration=1.697207468 podStartE2EDuration="4.250332482s" podCreationTimestamp="2026-04-19 15:26:22 +0000 UTC" firstStartedPulling="2026-04-19 15:26:22.931403548 +0000 UTC m=+141.731319093" lastFinishedPulling="2026-04-19 15:26:25.484528557 +0000 UTC m=+144.284444107" observedRunningTime="2026-04-19 15:26:26.248940485 +0000 UTC m=+145.048856070" watchObservedRunningTime="2026-04-19 15:26:26.250332482 +0000 UTC m=+145.050248041" Apr 19 15:26:34.188791 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.188758 2564 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-qcbcj"] Apr 19 15:26:34.192162 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.192138 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.195135 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.195112 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 19 15:26:34.195245 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.195229 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 19 15:26:34.195439 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.195427 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 19 15:26:34.195588 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.195568 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 19 15:26:34.196474 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.196455 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s2772\"" Apr 19 15:26:34.196567 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.196547 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 19 15:26:34.196626 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.196549 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 19 15:26:34.284733 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.284696 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-wtmp\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.284733 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.284731 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f31b0631-e29f-40f1-9099-bc910d8074ad-metrics-client-ca\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.284926 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.284759 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-accelerators-collector-config\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.284926 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.284833 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-tls\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.284926 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.284860 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 
19 15:26:34.284926 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.284905 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f31b0631-e29f-40f1-9099-bc910d8074ad-sys\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.285059 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.284943 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-textfile\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.285059 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.285000 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f31b0631-e29f-40f1-9099-bc910d8074ad-root\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.285059 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.285027 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w25fv\" (UniqueName: \"kubernetes.io/projected/f31b0631-e29f-40f1-9099-bc910d8074ad-kube-api-access-w25fv\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.385745 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.385710 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-wtmp\") pod \"node-exporter-qcbcj\" (UID: 
\"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.385745 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.385748 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f31b0631-e29f-40f1-9099-bc910d8074ad-metrics-client-ca\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.385979 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.385772 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-accelerators-collector-config\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.385979 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.385851 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-tls\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.385979 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.385883 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.385979 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.385889 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-wtmp\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.385979 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.385944 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f31b0631-e29f-40f1-9099-bc910d8074ad-sys\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.385979 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.385974 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-textfile\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.386278 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:26:34.386026 2564 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 19 15:26:34.386278 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.386059 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f31b0631-e29f-40f1-9099-bc910d8074ad-sys\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.386278 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.386030 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f31b0631-e29f-40f1-9099-bc910d8074ad-root\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.386278 ip-10-0-131-48 
kubenswrapper[2564]: I0419 15:26:34.386064 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f31b0631-e29f-40f1-9099-bc910d8074ad-root\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.386278 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:26:34.386108 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-tls podName:f31b0631-e29f-40f1-9099-bc910d8074ad nodeName:}" failed. No retries permitted until 2026-04-19 15:26:34.886087895 +0000 UTC m=+153.686003445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-tls") pod "node-exporter-qcbcj" (UID: "f31b0631-e29f-40f1-9099-bc910d8074ad") : secret "node-exporter-tls" not found Apr 19 15:26:34.386278 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.386152 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w25fv\" (UniqueName: \"kubernetes.io/projected/f31b0631-e29f-40f1-9099-bc910d8074ad-kube-api-access-w25fv\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.386581 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.386313 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-textfile\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.386581 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.386415 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/f31b0631-e29f-40f1-9099-bc910d8074ad-metrics-client-ca\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.386581 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.386416 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-accelerators-collector-config\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.388108 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.388086 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.394565 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.394536 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w25fv\" (UniqueName: \"kubernetes.io/projected/f31b0631-e29f-40f1-9099-bc910d8074ad-kube-api-access-w25fv\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.890408 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.890379 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-tls\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:34.892695 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:34.892664 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f31b0631-e29f-40f1-9099-bc910d8074ad-node-exporter-tls\") pod \"node-exporter-qcbcj\" (UID: \"f31b0631-e29f-40f1-9099-bc910d8074ad\") " pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:35.101530 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:35.101494 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qcbcj" Apr 19 15:26:35.109287 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:26:35.109258 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf31b0631_e29f_40f1_9099_bc910d8074ad.slice/crio-65cfc1bd73e6408b872dfee6fddf2e4c5e9d0c4cf8d2ccb56398352b4afdd3a1 WatchSource:0}: Error finding container 65cfc1bd73e6408b872dfee6fddf2e4c5e9d0c4cf8d2ccb56398352b4afdd3a1: Status 404 returned error can't find the container with id 65cfc1bd73e6408b872dfee6fddf2e4c5e9d0c4cf8d2ccb56398352b4afdd3a1 Apr 19 15:26:35.255963 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:35.255932 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qcbcj" event={"ID":"f31b0631-e29f-40f1-9099-bc910d8074ad","Type":"ContainerStarted","Data":"65cfc1bd73e6408b872dfee6fddf2e4c5e9d0c4cf8d2ccb56398352b4afdd3a1"} Apr 19 15:26:36.259384 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:36.259351 2564 generic.go:358] "Generic (PLEG): container finished" podID="f31b0631-e29f-40f1-9099-bc910d8074ad" containerID="fab62a146f33dd7296377f61992c251269be8115bf16074c4287fca8957711ab" exitCode=0 Apr 19 15:26:36.259770 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:36.259400 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qcbcj" 
event={"ID":"f31b0631-e29f-40f1-9099-bc910d8074ad","Type":"ContainerDied","Data":"fab62a146f33dd7296377f61992c251269be8115bf16074c4287fca8957711ab"} Apr 19 15:26:37.263820 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:37.263790 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qcbcj" event={"ID":"f31b0631-e29f-40f1-9099-bc910d8074ad","Type":"ContainerStarted","Data":"4a42a9747fdc68237caa31ad5ff68ac7e01f0a03a77b43d615f1320da4fbfd54"} Apr 19 15:26:37.263820 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:37.263826 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qcbcj" event={"ID":"f31b0631-e29f-40f1-9099-bc910d8074ad","Type":"ContainerStarted","Data":"4db6c20ce63fff5a27df1866703ba44f7ee052a8f07aadf71bec5404c2651f55"} Apr 19 15:26:37.286329 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:37.286287 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qcbcj" podStartSLOduration=2.415615466 podStartE2EDuration="3.286274058s" podCreationTimestamp="2026-04-19 15:26:34 +0000 UTC" firstStartedPulling="2026-04-19 15:26:35.111335542 +0000 UTC m=+153.911251093" lastFinishedPulling="2026-04-19 15:26:35.981993943 +0000 UTC m=+154.781909685" observedRunningTime="2026-04-19 15:26:37.284681578 +0000 UTC m=+156.084597141" watchObservedRunningTime="2026-04-19 15:26:37.286274058 +0000 UTC m=+156.086189616" Apr 19 15:26:37.676993 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:26:37.676906 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" podUID="53349c81-aa87-40ff-92e5-714108c8cefc" Apr 19 15:26:37.699509 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:26:37.699482 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], 
unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-lnp5x" podUID="2f59fe9a-35b7-4dae-ae52-22cef5a86c77" Apr 19 15:26:37.726768 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:26:37.726745 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-frphc" podUID="7577782b-d81a-428d-abba-4b2f85606b5a" Apr 19 15:26:38.266562 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:38.266526 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:26:38.267029 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:38.266526 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:26:38.267029 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:38.266538 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-frphc" Apr 19 15:26:42.651804 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:42.651769 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:26:42.654242 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:42.654216 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls\") pod \"image-registry-6c7fffcdf6-4s9gn\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:26:42.752670 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:42.752614 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert\") pod \"ingress-canary-lnp5x\" (UID: \"2f59fe9a-35b7-4dae-ae52-22cef5a86c77\") " pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:26:42.752827 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:42.752692 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " pod="openshift-dns/dns-default-frphc" Apr 19 15:26:42.754850 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:42.754821 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7577782b-d81a-428d-abba-4b2f85606b5a-metrics-tls\") pod \"dns-default-frphc\" (UID: \"7577782b-d81a-428d-abba-4b2f85606b5a\") " 
pod="openshift-dns/dns-default-frphc" Apr 19 15:26:42.755079 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:42.755057 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f59fe9a-35b7-4dae-ae52-22cef5a86c77-cert\") pod \"ingress-canary-lnp5x\" (UID: \"2f59fe9a-35b7-4dae-ae52-22cef5a86c77\") " pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:26:42.771087 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:42.771065 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zxl4j\"" Apr 19 15:26:42.771208 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:42.771067 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dvb5z\"" Apr 19 15:26:42.771208 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:42.771118 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-b94dq\"" Apr 19 15:26:42.778333 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:42.778316 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-frphc" Apr 19 15:26:42.778427 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:42.778411 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:26:42.778491 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:42.778474 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lnp5x" Apr 19 15:26:42.918123 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:42.918100 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-frphc"] Apr 19 15:26:42.919243 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:26:42.919216 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7577782b_d81a_428d_abba_4b2f85606b5a.slice/crio-1802a2a3e46c1787e6f264f09ff498819e1320a3fec4b41926289825cd098bc0 WatchSource:0}: Error finding container 1802a2a3e46c1787e6f264f09ff498819e1320a3fec4b41926289825cd098bc0: Status 404 returned error can't find the container with id 1802a2a3e46c1787e6f264f09ff498819e1320a3fec4b41926289825cd098bc0 Apr 19 15:26:43.142744 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:43.142712 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lnp5x"] Apr 19 15:26:43.145699 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:26:43.145668 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f59fe9a_35b7_4dae_ae52_22cef5a86c77.slice/crio-1f07a4a0430d0b8c503f563584581d1e9683d89090b602e1b576987f90c2827a WatchSource:0}: Error finding container 1f07a4a0430d0b8c503f563584581d1e9683d89090b602e1b576987f90c2827a: Status 404 returned error can't find the container with id 1f07a4a0430d0b8c503f563584581d1e9683d89090b602e1b576987f90c2827a Apr 19 15:26:43.146845 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:43.146816 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c7fffcdf6-4s9gn"] Apr 19 15:26:43.148690 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:26:43.148670 2564 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53349c81_aa87_40ff_92e5_714108c8cefc.slice/crio-3746a6f649885d3b7dd9d922b2fec95e2d180d6adf224c9bec6143b95498a481 WatchSource:0}: Error finding container 3746a6f649885d3b7dd9d922b2fec95e2d180d6adf224c9bec6143b95498a481: Status 404 returned error can't find the container with id 3746a6f649885d3b7dd9d922b2fec95e2d180d6adf224c9bec6143b95498a481 Apr 19 15:26:43.283624 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:43.283587 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" event={"ID":"53349c81-aa87-40ff-92e5-714108c8cefc","Type":"ContainerStarted","Data":"c69bb01206cac8750b494f25a66ff633fec60b43645619ccd0b127dfb379bf30"} Apr 19 15:26:43.283812 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:43.283654 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" event={"ID":"53349c81-aa87-40ff-92e5-714108c8cefc","Type":"ContainerStarted","Data":"3746a6f649885d3b7dd9d922b2fec95e2d180d6adf224c9bec6143b95498a481"} Apr 19 15:26:43.283812 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:43.283676 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:26:43.284692 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:43.284662 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lnp5x" event={"ID":"2f59fe9a-35b7-4dae-ae52-22cef5a86c77","Type":"ContainerStarted","Data":"1f07a4a0430d0b8c503f563584581d1e9683d89090b602e1b576987f90c2827a"} Apr 19 15:26:43.285656 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:43.285613 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-frphc" 
event={"ID":"7577782b-d81a-428d-abba-4b2f85606b5a","Type":"ContainerStarted","Data":"1802a2a3e46c1787e6f264f09ff498819e1320a3fec4b41926289825cd098bc0"} Apr 19 15:26:43.302403 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:43.302355 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" podStartSLOduration=161.302337402 podStartE2EDuration="2m41.302337402s" podCreationTimestamp="2026-04-19 15:24:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:26:43.300679213 +0000 UTC m=+162.100594773" watchObservedRunningTime="2026-04-19 15:26:43.302337402 +0000 UTC m=+162.102252962" Apr 19 15:26:44.971851 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:44.971820 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6c7fffcdf6-4s9gn"] Apr 19 15:26:45.293940 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:45.293908 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-frphc" event={"ID":"7577782b-d81a-428d-abba-4b2f85606b5a","Type":"ContainerStarted","Data":"f4f2debd971c8d43429a56bc40fb31613163fa08ae2192d00a300bb723fbc3ef"} Apr 19 15:26:45.293940 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:45.293942 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-frphc" event={"ID":"7577782b-d81a-428d-abba-4b2f85606b5a","Type":"ContainerStarted","Data":"d7836c460fdd0945a3816942a2f8bc76a6f8f0df5e7b4a8522ff4b2827fc2428"} Apr 19 15:26:45.294146 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:45.294015 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-frphc" Apr 19 15:26:45.295252 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:45.295228 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lnp5x" 
event={"ID":"2f59fe9a-35b7-4dae-ae52-22cef5a86c77","Type":"ContainerStarted","Data":"0ab7e828ebe5c20efed3afb64ed287b5971a3223b1e74cdbcd64be3e45f8a9f4"} Apr 19 15:26:45.310839 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:45.310795 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-frphc" podStartSLOduration=129.217500364 podStartE2EDuration="2m11.310783608s" podCreationTimestamp="2026-04-19 15:24:34 +0000 UTC" firstStartedPulling="2026-04-19 15:26:42.921166235 +0000 UTC m=+161.721081771" lastFinishedPulling="2026-04-19 15:26:45.014449479 +0000 UTC m=+163.814365015" observedRunningTime="2026-04-19 15:26:45.30937406 +0000 UTC m=+164.109289665" watchObservedRunningTime="2026-04-19 15:26:45.310783608 +0000 UTC m=+164.110699166" Apr 19 15:26:45.326404 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:45.326366 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lnp5x" podStartSLOduration=129.456290407 podStartE2EDuration="2m11.326353187s" podCreationTimestamp="2026-04-19 15:24:34 +0000 UTC" firstStartedPulling="2026-04-19 15:26:43.14771605 +0000 UTC m=+161.947631588" lastFinishedPulling="2026-04-19 15:26:45.01777883 +0000 UTC m=+163.817694368" observedRunningTime="2026-04-19 15:26:45.325983116 +0000 UTC m=+164.125898687" watchObservedRunningTime="2026-04-19 15:26:45.326353187 +0000 UTC m=+164.126268746" Apr 19 15:26:55.300312 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:26:55.300242 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-frphc" Apr 19 15:27:05.300587 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:05.300559 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:27:08.894968 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:08.894940 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-lnp5x_2f59fe9a-35b7-4dae-ae52-22cef5a86c77/serve-healthcheck-canary/0.log" Apr 19 15:27:10.314080 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.314015 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" podUID="53349c81-aa87-40ff-92e5-714108c8cefc" containerName="registry" containerID="cri-o://c69bb01206cac8750b494f25a66ff633fec60b43645619ccd0b127dfb379bf30" gracePeriod=30 Apr 19 15:27:10.548761 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.548740 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:27:10.646347 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.646274 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/53349c81-aa87-40ff-92e5-714108c8cefc-image-registry-private-configuration\") pod \"53349c81-aa87-40ff-92e5-714108c8cefc\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " Apr 19 15:27:10.646347 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.646309 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-bound-sa-token\") pod \"53349c81-aa87-40ff-92e5-714108c8cefc\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " Apr 19 15:27:10.646531 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.646364 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53349c81-aa87-40ff-92e5-714108c8cefc-ca-trust-extracted\") pod \"53349c81-aa87-40ff-92e5-714108c8cefc\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " Apr 19 15:27:10.646531 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.646382 
2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd8w5\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-kube-api-access-wd8w5\") pod \"53349c81-aa87-40ff-92e5-714108c8cefc\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " Apr 19 15:27:10.646531 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.646417 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53349c81-aa87-40ff-92e5-714108c8cefc-registry-certificates\") pod \"53349c81-aa87-40ff-92e5-714108c8cefc\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " Apr 19 15:27:10.646531 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.646511 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls\") pod \"53349c81-aa87-40ff-92e5-714108c8cefc\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " Apr 19 15:27:10.646766 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.646557 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53349c81-aa87-40ff-92e5-714108c8cefc-trusted-ca\") pod \"53349c81-aa87-40ff-92e5-714108c8cefc\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " Apr 19 15:27:10.646766 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.646594 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53349c81-aa87-40ff-92e5-714108c8cefc-installation-pull-secrets\") pod \"53349c81-aa87-40ff-92e5-714108c8cefc\" (UID: \"53349c81-aa87-40ff-92e5-714108c8cefc\") " Apr 19 15:27:10.646918 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.646892 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/53349c81-aa87-40ff-92e5-714108c8cefc-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "53349c81-aa87-40ff-92e5-714108c8cefc" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:27:10.647019 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.646993 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53349c81-aa87-40ff-92e5-714108c8cefc-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "53349c81-aa87-40ff-92e5-714108c8cefc" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:27:10.648916 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.648893 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "53349c81-aa87-40ff-92e5-714108c8cefc" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:27:10.649019 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.648966 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53349c81-aa87-40ff-92e5-714108c8cefc-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "53349c81-aa87-40ff-92e5-714108c8cefc" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 15:27:10.649063 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.649013 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-kube-api-access-wd8w5" (OuterVolumeSpecName: "kube-api-access-wd8w5") pod "53349c81-aa87-40ff-92e5-714108c8cefc" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc"). InnerVolumeSpecName "kube-api-access-wd8w5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:27:10.649121 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.649098 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53349c81-aa87-40ff-92e5-714108c8cefc-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "53349c81-aa87-40ff-92e5-714108c8cefc" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 15:27:10.650361 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.650344 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "53349c81-aa87-40ff-92e5-714108c8cefc" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:27:10.656092 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.656069 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53349c81-aa87-40ff-92e5-714108c8cefc-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "53349c81-aa87-40ff-92e5-714108c8cefc" (UID: "53349c81-aa87-40ff-92e5-714108c8cefc"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:27:10.747972 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.747945 2564 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53349c81-aa87-40ff-92e5-714108c8cefc-ca-trust-extracted\") on node \"ip-10-0-131-48.ec2.internal\" DevicePath \"\"" Apr 19 15:27:10.747972 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.747969 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wd8w5\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-kube-api-access-wd8w5\") on node \"ip-10-0-131-48.ec2.internal\" DevicePath \"\"" Apr 19 15:27:10.748112 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.747978 2564 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53349c81-aa87-40ff-92e5-714108c8cefc-registry-certificates\") on node \"ip-10-0-131-48.ec2.internal\" DevicePath \"\"" Apr 19 15:27:10.748112 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.747988 2564 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-registry-tls\") on node \"ip-10-0-131-48.ec2.internal\" DevicePath \"\"" Apr 19 15:27:10.748112 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.747997 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53349c81-aa87-40ff-92e5-714108c8cefc-trusted-ca\") on node \"ip-10-0-131-48.ec2.internal\" DevicePath \"\"" Apr 19 15:27:10.748112 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.748005 2564 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53349c81-aa87-40ff-92e5-714108c8cefc-installation-pull-secrets\") on node \"ip-10-0-131-48.ec2.internal\" DevicePath \"\"" Apr 19 15:27:10.748112 ip-10-0-131-48 
kubenswrapper[2564]: I0419 15:27:10.748014 2564 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/53349c81-aa87-40ff-92e5-714108c8cefc-image-registry-private-configuration\") on node \"ip-10-0-131-48.ec2.internal\" DevicePath \"\"" Apr 19 15:27:10.748112 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:10.748023 2564 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53349c81-aa87-40ff-92e5-714108c8cefc-bound-sa-token\") on node \"ip-10-0-131-48.ec2.internal\" DevicePath \"\"" Apr 19 15:27:11.361481 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:11.361448 2564 generic.go:358] "Generic (PLEG): container finished" podID="53349c81-aa87-40ff-92e5-714108c8cefc" containerID="c69bb01206cac8750b494f25a66ff633fec60b43645619ccd0b127dfb379bf30" exitCode=0 Apr 19 15:27:11.361905 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:11.361515 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" Apr 19 15:27:11.361905 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:11.361533 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" event={"ID":"53349c81-aa87-40ff-92e5-714108c8cefc","Type":"ContainerDied","Data":"c69bb01206cac8750b494f25a66ff633fec60b43645619ccd0b127dfb379bf30"} Apr 19 15:27:11.361905 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:11.361572 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c7fffcdf6-4s9gn" event={"ID":"53349c81-aa87-40ff-92e5-714108c8cefc","Type":"ContainerDied","Data":"3746a6f649885d3b7dd9d922b2fec95e2d180d6adf224c9bec6143b95498a481"} Apr 19 15:27:11.361905 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:11.361593 2564 scope.go:117] "RemoveContainer" containerID="c69bb01206cac8750b494f25a66ff633fec60b43645619ccd0b127dfb379bf30" Apr 19 15:27:11.369358 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:11.369340 2564 scope.go:117] "RemoveContainer" containerID="c69bb01206cac8750b494f25a66ff633fec60b43645619ccd0b127dfb379bf30" Apr 19 15:27:11.369625 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:27:11.369591 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c69bb01206cac8750b494f25a66ff633fec60b43645619ccd0b127dfb379bf30\": container with ID starting with c69bb01206cac8750b494f25a66ff633fec60b43645619ccd0b127dfb379bf30 not found: ID does not exist" containerID="c69bb01206cac8750b494f25a66ff633fec60b43645619ccd0b127dfb379bf30" Apr 19 15:27:11.369730 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:11.369651 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69bb01206cac8750b494f25a66ff633fec60b43645619ccd0b127dfb379bf30"} err="failed to get container status 
\"c69bb01206cac8750b494f25a66ff633fec60b43645619ccd0b127dfb379bf30\": rpc error: code = NotFound desc = could not find container \"c69bb01206cac8750b494f25a66ff633fec60b43645619ccd0b127dfb379bf30\": container with ID starting with c69bb01206cac8750b494f25a66ff633fec60b43645619ccd0b127dfb379bf30 not found: ID does not exist" Apr 19 15:27:11.381472 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:11.381449 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6c7fffcdf6-4s9gn"] Apr 19 15:27:11.387756 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:11.387729 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6c7fffcdf6-4s9gn"] Apr 19 15:27:11.666151 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:11.666077 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53349c81-aa87-40ff-92e5-714108c8cefc" path="/var/lib/kubelet/pods/53349c81-aa87-40ff-92e5-714108c8cefc/volumes" Apr 19 15:27:34.934411 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:34.934354 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" podUID="d771e17e-8f69-4e6b-b939-24cade594a96" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 19 15:27:44.934812 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:44.934777 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" podUID="d771e17e-8f69-4e6b-b939-24cade594a96" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 19 15:27:54.934404 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:54.934361 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" podUID="d771e17e-8f69-4e6b-b939-24cade594a96" 
containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 19 15:27:54.934864 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:54.934435 2564 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" Apr 19 15:27:54.934926 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:54.934907 2564 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"e4ca02c2f5744fd0bd98ebd1f2edca142a15d7575c7247119a79dde948e65d6b"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 19 15:27:54.934974 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:54.934946 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" podUID="d771e17e-8f69-4e6b-b939-24cade594a96" containerName="service-proxy" containerID="cri-o://e4ca02c2f5744fd0bd98ebd1f2edca142a15d7575c7247119a79dde948e65d6b" gracePeriod=30 Apr 19 15:27:55.477902 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:55.477870 2564 generic.go:358] "Generic (PLEG): container finished" podID="d771e17e-8f69-4e6b-b939-24cade594a96" containerID="e4ca02c2f5744fd0bd98ebd1f2edca142a15d7575c7247119a79dde948e65d6b" exitCode=2 Apr 19 15:27:55.478065 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:55.477917 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" event={"ID":"d771e17e-8f69-4e6b-b939-24cade594a96","Type":"ContainerDied","Data":"e4ca02c2f5744fd0bd98ebd1f2edca142a15d7575c7247119a79dde948e65d6b"} Apr 19 15:27:55.478065 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:27:55.477949 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d7dbdfc8c-2kzq6" event={"ID":"d771e17e-8f69-4e6b-b939-24cade594a96","Type":"ContainerStarted","Data":"d9f48c2439bdac1e795ca8e581d5b76dacf20168e13e5d211b8ca9fea726dacd"} Apr 19 15:29:01.613748 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:01.613716 2564 kubelet.go:1628] "Image garbage collection succeeded" Apr 19 15:29:30.545506 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.545469 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-bnhxp"] Apr 19 15:29:30.547747 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.545729 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53349c81-aa87-40ff-92e5-714108c8cefc" containerName="registry" Apr 19 15:29:30.547747 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.545741 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="53349c81-aa87-40ff-92e5-714108c8cefc" containerName="registry" Apr 19 15:29:30.547747 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.545779 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="53349c81-aa87-40ff-92e5-714108c8cefc" containerName="registry" Apr 19 15:29:30.548608 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.548593 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-bnhxp" Apr 19 15:29:30.551100 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.551077 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 19 15:29:30.551230 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.551165 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 19 15:29:30.552277 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.552262 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-fx6l2\"" Apr 19 15:29:30.554385 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.554364 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-bnhxp"] Apr 19 15:29:30.595117 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.595092 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brspb\" (UniqueName: \"kubernetes.io/projected/0f0df24a-086f-4033-b10a-02f1fd1ac299-kube-api-access-brspb\") pod \"cert-manager-cainjector-8966b78d4-bnhxp\" (UID: \"0f0df24a-086f-4033-b10a-02f1fd1ac299\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bnhxp" Apr 19 15:29:30.595244 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.595125 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f0df24a-086f-4033-b10a-02f1fd1ac299-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-bnhxp\" (UID: \"0f0df24a-086f-4033-b10a-02f1fd1ac299\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bnhxp" Apr 19 15:29:30.695792 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.695757 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-brspb\" (UniqueName: \"kubernetes.io/projected/0f0df24a-086f-4033-b10a-02f1fd1ac299-kube-api-access-brspb\") pod \"cert-manager-cainjector-8966b78d4-bnhxp\" (UID: \"0f0df24a-086f-4033-b10a-02f1fd1ac299\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bnhxp" Apr 19 15:29:30.695792 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.695804 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f0df24a-086f-4033-b10a-02f1fd1ac299-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-bnhxp\" (UID: \"0f0df24a-086f-4033-b10a-02f1fd1ac299\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bnhxp" Apr 19 15:29:30.703071 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.703041 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f0df24a-086f-4033-b10a-02f1fd1ac299-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-bnhxp\" (UID: \"0f0df24a-086f-4033-b10a-02f1fd1ac299\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bnhxp" Apr 19 15:29:30.703194 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.703109 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brspb\" (UniqueName: \"kubernetes.io/projected/0f0df24a-086f-4033-b10a-02f1fd1ac299-kube-api-access-brspb\") pod \"cert-manager-cainjector-8966b78d4-bnhxp\" (UID: \"0f0df24a-086f-4033-b10a-02f1fd1ac299\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bnhxp" Apr 19 15:29:30.858148 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.858064 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-bnhxp" Apr 19 15:29:30.971459 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.971428 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-bnhxp"] Apr 19 15:29:30.974671 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:29:30.974643 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f0df24a_086f_4033_b10a_02f1fd1ac299.slice/crio-7e2cb883ffcb16ed6bc74244ccdd53ccd30f88e0e157dd71b4857913cb03f51f WatchSource:0}: Error finding container 7e2cb883ffcb16ed6bc74244ccdd53ccd30f88e0e157dd71b4857913cb03f51f: Status 404 returned error can't find the container with id 7e2cb883ffcb16ed6bc74244ccdd53ccd30f88e0e157dd71b4857913cb03f51f Apr 19 15:29:30.976875 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:30.976856 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 15:29:31.722837 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:31.722806 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-bnhxp" event={"ID":"0f0df24a-086f-4033-b10a-02f1fd1ac299","Type":"ContainerStarted","Data":"7e2cb883ffcb16ed6bc74244ccdd53ccd30f88e0e157dd71b4857913cb03f51f"} Apr 19 15:29:34.732760 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:34.732717 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-bnhxp" event={"ID":"0f0df24a-086f-4033-b10a-02f1fd1ac299","Type":"ContainerStarted","Data":"971efec62edcbbae6ccebacf9fee9ea890a386951e81737e89ecfdd0edf67456"} Apr 19 15:29:34.747299 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:34.747253 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-bnhxp" podStartSLOduration=1.219694462 podStartE2EDuration="4.747237158s" 
podCreationTimestamp="2026-04-19 15:29:30 +0000 UTC" firstStartedPulling="2026-04-19 15:29:30.976993071 +0000 UTC m=+329.776908608" lastFinishedPulling="2026-04-19 15:29:34.504535759 +0000 UTC m=+333.304451304" observedRunningTime="2026-04-19 15:29:34.746952567 +0000 UTC m=+333.546868128" watchObservedRunningTime="2026-04-19 15:29:34.747237158 +0000 UTC m=+333.547152719" Apr 19 15:29:54.787835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.787758 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2"] Apr 19 15:29:54.790853 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.790830 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" Apr 19 15:29:54.794800 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.794780 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 19 15:29:54.796403 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.796374 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 19 15:29:54.797139 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.797121 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 19 15:29:54.798236 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.798219 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 19 15:29:54.803189 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.798762 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pcf6s\"" Apr 19 15:29:54.806608 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.806584 
2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2"] Apr 19 15:29:54.858331 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.858299 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv2wg\" (UniqueName: \"kubernetes.io/projected/5e7a403b-d82f-4088-93a8-a4cc60d4be4a-kube-api-access-sv2wg\") pod \"opendatahub-operator-controller-manager-67944f454b-h7kh2\" (UID: \"5e7a403b-d82f-4088-93a8-a4cc60d4be4a\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" Apr 19 15:29:54.858509 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.858350 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5e7a403b-d82f-4088-93a8-a4cc60d4be4a-webhook-cert\") pod \"opendatahub-operator-controller-manager-67944f454b-h7kh2\" (UID: \"5e7a403b-d82f-4088-93a8-a4cc60d4be4a\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" Apr 19 15:29:54.858509 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.858443 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5e7a403b-d82f-4088-93a8-a4cc60d4be4a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-67944f454b-h7kh2\" (UID: \"5e7a403b-d82f-4088-93a8-a4cc60d4be4a\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" Apr 19 15:29:54.959348 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.959312 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5e7a403b-d82f-4088-93a8-a4cc60d4be4a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-67944f454b-h7kh2\" (UID: \"5e7a403b-d82f-4088-93a8-a4cc60d4be4a\") " 
pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" Apr 19 15:29:54.959511 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.959369 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv2wg\" (UniqueName: \"kubernetes.io/projected/5e7a403b-d82f-4088-93a8-a4cc60d4be4a-kube-api-access-sv2wg\") pod \"opendatahub-operator-controller-manager-67944f454b-h7kh2\" (UID: \"5e7a403b-d82f-4088-93a8-a4cc60d4be4a\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" Apr 19 15:29:54.959511 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.959411 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5e7a403b-d82f-4088-93a8-a4cc60d4be4a-webhook-cert\") pod \"opendatahub-operator-controller-manager-67944f454b-h7kh2\" (UID: \"5e7a403b-d82f-4088-93a8-a4cc60d4be4a\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" Apr 19 15:29:54.961868 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.961836 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5e7a403b-d82f-4088-93a8-a4cc60d4be4a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-67944f454b-h7kh2\" (UID: \"5e7a403b-d82f-4088-93a8-a4cc60d4be4a\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" Apr 19 15:29:54.961989 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.961966 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5e7a403b-d82f-4088-93a8-a4cc60d4be4a-webhook-cert\") pod \"opendatahub-operator-controller-manager-67944f454b-h7kh2\" (UID: \"5e7a403b-d82f-4088-93a8-a4cc60d4be4a\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" Apr 19 15:29:54.971184 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:54.971157 
2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv2wg\" (UniqueName: \"kubernetes.io/projected/5e7a403b-d82f-4088-93a8-a4cc60d4be4a-kube-api-access-sv2wg\") pod \"opendatahub-operator-controller-manager-67944f454b-h7kh2\" (UID: \"5e7a403b-d82f-4088-93a8-a4cc60d4be4a\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" Apr 19 15:29:55.101475 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:55.101404 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" Apr 19 15:29:55.224604 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:55.224565 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2"] Apr 19 15:29:55.228277 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:29:55.228239 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e7a403b_d82f_4088_93a8_a4cc60d4be4a.slice/crio-5c20de2b96259b8a2908103cfff939821e56afcae37f03b249a252401dc2d67c WatchSource:0}: Error finding container 5c20de2b96259b8a2908103cfff939821e56afcae37f03b249a252401dc2d67c: Status 404 returned error can't find the container with id 5c20de2b96259b8a2908103cfff939821e56afcae37f03b249a252401dc2d67c Apr 19 15:29:55.786101 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:55.786067 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" event={"ID":"5e7a403b-d82f-4088-93a8-a4cc60d4be4a","Type":"ContainerStarted","Data":"5c20de2b96259b8a2908103cfff939821e56afcae37f03b249a252401dc2d67c"} Apr 19 15:29:58.795143 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:58.795110 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" 
event={"ID":"5e7a403b-d82f-4088-93a8-a4cc60d4be4a","Type":"ContainerStarted","Data":"facc4078f01865f33a25282f5a4e43a98b7f5e4da7d26dc48b39685c8a3d4dc9"} Apr 19 15:29:58.795557 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:58.795266 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" Apr 19 15:29:58.816850 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:29:58.816801 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" podStartSLOduration=2.125438432 podStartE2EDuration="4.816786945s" podCreationTimestamp="2026-04-19 15:29:54 +0000 UTC" firstStartedPulling="2026-04-19 15:29:55.229859273 +0000 UTC m=+354.029774810" lastFinishedPulling="2026-04-19 15:29:57.921207782 +0000 UTC m=+356.721123323" observedRunningTime="2026-04-19 15:29:58.815030104 +0000 UTC m=+357.614945663" watchObservedRunningTime="2026-04-19 15:29:58.816786945 +0000 UTC m=+357.616702503" Apr 19 15:30:09.799833 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:09.799804 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-h7kh2" Apr 19 15:30:15.910502 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:15.910462 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-9vd7p"] Apr 19 15:30:15.917760 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:15.917738 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p" Apr 19 15:30:15.920053 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:15.920025 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-9vd7p"] Apr 19 15:30:15.920383 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:15.920357 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-fzk9d\"" Apr 19 15:30:15.920499 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:15.920356 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 19 15:30:16.013737 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:16.013706 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4395055e-f594-437e-8773-9fc927742701-cert\") pod \"odh-model-controller-858dbf95b8-9vd7p\" (UID: \"4395055e-f594-437e-8773-9fc927742701\") " pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p" Apr 19 15:30:16.013918 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:16.013778 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xnzc\" (UniqueName: \"kubernetes.io/projected/4395055e-f594-437e-8773-9fc927742701-kube-api-access-8xnzc\") pod \"odh-model-controller-858dbf95b8-9vd7p\" (UID: \"4395055e-f594-437e-8773-9fc927742701\") " pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p" Apr 19 15:30:16.114383 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:16.114348 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xnzc\" (UniqueName: \"kubernetes.io/projected/4395055e-f594-437e-8773-9fc927742701-kube-api-access-8xnzc\") pod \"odh-model-controller-858dbf95b8-9vd7p\" (UID: \"4395055e-f594-437e-8773-9fc927742701\") " 
pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p" Apr 19 15:30:16.114546 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:16.114390 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4395055e-f594-437e-8773-9fc927742701-cert\") pod \"odh-model-controller-858dbf95b8-9vd7p\" (UID: \"4395055e-f594-437e-8773-9fc927742701\") " pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p" Apr 19 15:30:16.114546 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:30:16.114500 2564 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 19 15:30:16.114621 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:30:16.114578 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4395055e-f594-437e-8773-9fc927742701-cert podName:4395055e-f594-437e-8773-9fc927742701 nodeName:}" failed. No retries permitted until 2026-04-19 15:30:16.614555423 +0000 UTC m=+375.414470961 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4395055e-f594-437e-8773-9fc927742701-cert") pod "odh-model-controller-858dbf95b8-9vd7p" (UID: "4395055e-f594-437e-8773-9fc927742701") : secret "odh-model-controller-webhook-cert" not found Apr 19 15:30:16.123608 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:16.123575 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xnzc\" (UniqueName: \"kubernetes.io/projected/4395055e-f594-437e-8773-9fc927742701-kube-api-access-8xnzc\") pod \"odh-model-controller-858dbf95b8-9vd7p\" (UID: \"4395055e-f594-437e-8773-9fc927742701\") " pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p" Apr 19 15:30:16.618457 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:16.618419 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4395055e-f594-437e-8773-9fc927742701-cert\") pod \"odh-model-controller-858dbf95b8-9vd7p\" (UID: \"4395055e-f594-437e-8773-9fc927742701\") " pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p" Apr 19 15:30:16.620886 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:16.620856 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4395055e-f594-437e-8773-9fc927742701-cert\") pod \"odh-model-controller-858dbf95b8-9vd7p\" (UID: \"4395055e-f594-437e-8773-9fc927742701\") " pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p" Apr 19 15:30:16.828730 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:16.828691 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p" Apr 19 15:30:16.952738 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:16.952716 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-9vd7p"] Apr 19 15:30:16.955348 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:30:16.955320 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4395055e_f594_437e_8773_9fc927742701.slice/crio-a1cd8d0ff9c9c6c27106e894c11e30944798c7aa25f6c57fb884656332992d8d WatchSource:0}: Error finding container a1cd8d0ff9c9c6c27106e894c11e30944798c7aa25f6c57fb884656332992d8d: Status 404 returned error can't find the container with id a1cd8d0ff9c9c6c27106e894c11e30944798c7aa25f6c57fb884656332992d8d Apr 19 15:30:17.847107 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:17.847070 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p" event={"ID":"4395055e-f594-437e-8773-9fc927742701","Type":"ContainerStarted","Data":"a1cd8d0ff9c9c6c27106e894c11e30944798c7aa25f6c57fb884656332992d8d"} Apr 19 15:30:18.273778 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.273747 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm"] Apr 19 15:30:18.276562 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.276542 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm" Apr 19 15:30:18.278902 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.278885 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-xd27t\"" Apr 19 15:30:18.280086 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.280066 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 19 15:30:18.280086 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.280081 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 19 15:30:18.280252 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.280066 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 19 15:30:18.280252 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.280068 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 19 15:30:18.285884 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.285862 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm"] Apr 19 15:30:18.431297 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.431264 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95c31735-4f02-453a-97bb-32db7c2850d7-tmp\") pod \"kube-auth-proxy-745fb64bc5-kj7bm\" (UID: \"95c31735-4f02-453a-97bb-32db7c2850d7\") " pod="openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm" Apr 19 15:30:18.431459 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.431324 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/95c31735-4f02-453a-97bb-32db7c2850d7-tls-certs\") pod \"kube-auth-proxy-745fb64bc5-kj7bm\" (UID: \"95c31735-4f02-453a-97bb-32db7c2850d7\") " pod="openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm" Apr 19 15:30:18.431459 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.431389 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfkbj\" (UniqueName: \"kubernetes.io/projected/95c31735-4f02-453a-97bb-32db7c2850d7-kube-api-access-rfkbj\") pod \"kube-auth-proxy-745fb64bc5-kj7bm\" (UID: \"95c31735-4f02-453a-97bb-32db7c2850d7\") " pod="openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm" Apr 19 15:30:18.532693 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.532591 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95c31735-4f02-453a-97bb-32db7c2850d7-tmp\") pod \"kube-auth-proxy-745fb64bc5-kj7bm\" (UID: \"95c31735-4f02-453a-97bb-32db7c2850d7\") " pod="openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm" Apr 19 15:30:18.532854 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.532804 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95c31735-4f02-453a-97bb-32db7c2850d7-tls-certs\") pod \"kube-auth-proxy-745fb64bc5-kj7bm\" (UID: \"95c31735-4f02-453a-97bb-32db7c2850d7\") " pod="openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm" Apr 19 15:30:18.532936 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.532917 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rfkbj\" (UniqueName: \"kubernetes.io/projected/95c31735-4f02-453a-97bb-32db7c2850d7-kube-api-access-rfkbj\") pod \"kube-auth-proxy-745fb64bc5-kj7bm\" (UID: \"95c31735-4f02-453a-97bb-32db7c2850d7\") " pod="openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm" Apr 19 15:30:18.535813 ip-10-0-131-48 kubenswrapper[2564]: I0419 
15:30:18.535747 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95c31735-4f02-453a-97bb-32db7c2850d7-tls-certs\") pod \"kube-auth-proxy-745fb64bc5-kj7bm\" (UID: \"95c31735-4f02-453a-97bb-32db7c2850d7\") " pod="openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm" Apr 19 15:30:18.536125 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.536087 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95c31735-4f02-453a-97bb-32db7c2850d7-tmp\") pod \"kube-auth-proxy-745fb64bc5-kj7bm\" (UID: \"95c31735-4f02-453a-97bb-32db7c2850d7\") " pod="openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm" Apr 19 15:30:18.540905 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.540883 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfkbj\" (UniqueName: \"kubernetes.io/projected/95c31735-4f02-453a-97bb-32db7c2850d7-kube-api-access-rfkbj\") pod \"kube-auth-proxy-745fb64bc5-kj7bm\" (UID: \"95c31735-4f02-453a-97bb-32db7c2850d7\") " pod="openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm" Apr 19 15:30:18.587745 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.587710 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm" Apr 19 15:30:18.719559 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.719527 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm"] Apr 19 15:30:18.724017 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:30:18.723986 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95c31735_4f02_453a_97bb_32db7c2850d7.slice/crio-4ed1e530d48767cb41636a938fdd2fcd5cb9dce614d94db4b599032f3652c3a4 WatchSource:0}: Error finding container 4ed1e530d48767cb41636a938fdd2fcd5cb9dce614d94db4b599032f3652c3a4: Status 404 returned error can't find the container with id 4ed1e530d48767cb41636a938fdd2fcd5cb9dce614d94db4b599032f3652c3a4 Apr 19 15:30:18.851794 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:18.851705 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm" event={"ID":"95c31735-4f02-453a-97bb-32db7c2850d7","Type":"ContainerStarted","Data":"4ed1e530d48767cb41636a938fdd2fcd5cb9dce614d94db4b599032f3652c3a4"} Apr 19 15:30:20.860034 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:20.859976 2564 generic.go:358] "Generic (PLEG): container finished" podID="4395055e-f594-437e-8773-9fc927742701" containerID="94117082d8c2803cb39dc73a6bdff48cceb1b158ad36a89e507556ba3f2d636c" exitCode=1 Apr 19 15:30:20.860454 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:20.860078 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p" event={"ID":"4395055e-f594-437e-8773-9fc927742701","Type":"ContainerDied","Data":"94117082d8c2803cb39dc73a6bdff48cceb1b158ad36a89e507556ba3f2d636c"} Apr 19 15:30:20.860454 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:20.860410 2564 scope.go:117] "RemoveContainer" containerID="94117082d8c2803cb39dc73a6bdff48cceb1b158ad36a89e507556ba3f2d636c" Apr 19 
15:30:21.651695 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:21.651617 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-4dqqr"]
Apr 19 15:30:21.658449 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:21.658416 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-4dqqr"
Apr 19 15:30:21.660776 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:21.660751 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-4dqqr"]
Apr 19 15:30:21.661731 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:21.661490 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 19 15:30:21.661731 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:21.661531 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-zsmqs\""
Apr 19 15:30:21.758202 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:21.758131 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a165459e-29e1-4ee9-bec5-5463ea0f6f21-cert\") pod \"kserve-controller-manager-856948b99f-4dqqr\" (UID: \"a165459e-29e1-4ee9-bec5-5463ea0f6f21\") " pod="opendatahub/kserve-controller-manager-856948b99f-4dqqr"
Apr 19 15:30:21.758599 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:21.758489 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt5km\" (UniqueName: \"kubernetes.io/projected/a165459e-29e1-4ee9-bec5-5463ea0f6f21-kube-api-access-jt5km\") pod \"kserve-controller-manager-856948b99f-4dqqr\" (UID: \"a165459e-29e1-4ee9-bec5-5463ea0f6f21\") " pod="opendatahub/kserve-controller-manager-856948b99f-4dqqr"
Apr 19 15:30:21.859600 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:21.859565 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a165459e-29e1-4ee9-bec5-5463ea0f6f21-cert\") pod \"kserve-controller-manager-856948b99f-4dqqr\" (UID: \"a165459e-29e1-4ee9-bec5-5463ea0f6f21\") " pod="opendatahub/kserve-controller-manager-856948b99f-4dqqr"
Apr 19 15:30:21.859797 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:21.859615 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jt5km\" (UniqueName: \"kubernetes.io/projected/a165459e-29e1-4ee9-bec5-5463ea0f6f21-kube-api-access-jt5km\") pod \"kserve-controller-manager-856948b99f-4dqqr\" (UID: \"a165459e-29e1-4ee9-bec5-5463ea0f6f21\") " pod="opendatahub/kserve-controller-manager-856948b99f-4dqqr"
Apr 19 15:30:21.859797 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:30:21.859762 2564 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 19 15:30:21.859892 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:30:21.859831 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a165459e-29e1-4ee9-bec5-5463ea0f6f21-cert podName:a165459e-29e1-4ee9-bec5-5463ea0f6f21 nodeName:}" failed. No retries permitted until 2026-04-19 15:30:22.359810443 +0000 UTC m=+381.159725981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a165459e-29e1-4ee9-bec5-5463ea0f6f21-cert") pod "kserve-controller-manager-856948b99f-4dqqr" (UID: "a165459e-29e1-4ee9-bec5-5463ea0f6f21") : secret "kserve-webhook-server-cert" not found
Apr 19 15:30:21.870937 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:21.870908 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt5km\" (UniqueName: \"kubernetes.io/projected/a165459e-29e1-4ee9-bec5-5463ea0f6f21-kube-api-access-jt5km\") pod \"kserve-controller-manager-856948b99f-4dqqr\" (UID: \"a165459e-29e1-4ee9-bec5-5463ea0f6f21\") " pod="opendatahub/kserve-controller-manager-856948b99f-4dqqr"
Apr 19 15:30:22.364387 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:22.364338 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a165459e-29e1-4ee9-bec5-5463ea0f6f21-cert\") pod \"kserve-controller-manager-856948b99f-4dqqr\" (UID: \"a165459e-29e1-4ee9-bec5-5463ea0f6f21\") " pod="opendatahub/kserve-controller-manager-856948b99f-4dqqr"
Apr 19 15:30:22.367289 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:22.367257 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a165459e-29e1-4ee9-bec5-5463ea0f6f21-cert\") pod \"kserve-controller-manager-856948b99f-4dqqr\" (UID: \"a165459e-29e1-4ee9-bec5-5463ea0f6f21\") " pod="opendatahub/kserve-controller-manager-856948b99f-4dqqr"
Apr 19 15:30:22.571440 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:22.571407 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-4dqqr"
Apr 19 15:30:22.689792 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:22.689760 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-4dqqr"]
Apr 19 15:30:22.692716 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:30:22.692691 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda165459e_29e1_4ee9_bec5_5463ea0f6f21.slice/crio-75c36aca878f41a162a55256e4df8c4c04288f0568e408570e67cae95a790041 WatchSource:0}: Error finding container 75c36aca878f41a162a55256e4df8c4c04288f0568e408570e67cae95a790041: Status 404 returned error can't find the container with id 75c36aca878f41a162a55256e4df8c4c04288f0568e408570e67cae95a790041
Apr 19 15:30:22.871498 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:22.868218 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm" event={"ID":"95c31735-4f02-453a-97bb-32db7c2850d7","Type":"ContainerStarted","Data":"664759fc64203b4026d1866464873e57bf93456f641bc80d6a6a9498ee36d986"}
Apr 19 15:30:22.874902 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:22.874875 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-4dqqr" event={"ID":"a165459e-29e1-4ee9-bec5-5463ea0f6f21","Type":"ContainerStarted","Data":"75c36aca878f41a162a55256e4df8c4c04288f0568e408570e67cae95a790041"}
Apr 19 15:30:22.876654 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:22.876618 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p" event={"ID":"4395055e-f594-437e-8773-9fc927742701","Type":"ContainerStarted","Data":"1fedf71effd257631c7b9155d7ebc66c073c716f3231f4bbd936832ee7b75063"}
Apr 19 15:30:22.876797 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:22.876785 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p"
Apr 19 15:30:22.905836 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:22.905791 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-745fb64bc5-kj7bm" podStartSLOduration=1.186625474 podStartE2EDuration="4.90577726s" podCreationTimestamp="2026-04-19 15:30:18 +0000 UTC" firstStartedPulling="2026-04-19 15:30:18.726684835 +0000 UTC m=+377.526600372" lastFinishedPulling="2026-04-19 15:30:22.445836617 +0000 UTC m=+381.245752158" observedRunningTime="2026-04-19 15:30:22.904123059 +0000 UTC m=+381.704038630" watchObservedRunningTime="2026-04-19 15:30:22.90577726 +0000 UTC m=+381.705692816"
Apr 19 15:30:22.931830 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:22.931785 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p" podStartSLOduration=2.44432124 podStartE2EDuration="7.931771397s" podCreationTimestamp="2026-04-19 15:30:15 +0000 UTC" firstStartedPulling="2026-04-19 15:30:16.956544209 +0000 UTC m=+375.756459745" lastFinishedPulling="2026-04-19 15:30:22.443994354 +0000 UTC m=+381.243909902" observedRunningTime="2026-04-19 15:30:22.929807692 +0000 UTC m=+381.729723251" watchObservedRunningTime="2026-04-19 15:30:22.931771397 +0000 UTC m=+381.731686956"
Apr 19 15:30:25.888098 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:25.888062 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-4dqqr" event={"ID":"a165459e-29e1-4ee9-bec5-5463ea0f6f21","Type":"ContainerStarted","Data":"8db64bc8751451d27bd1796c4bf90630336c4c58364e0ad858199fdba411bf5e"}
Apr 19 15:30:25.888098 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:25.888120 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-4dqqr"
Apr 19 15:30:25.902826 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:25.902770 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-4dqqr" podStartSLOduration=2.396649619 podStartE2EDuration="4.90275342s" podCreationTimestamp="2026-04-19 15:30:21 +0000 UTC" firstStartedPulling="2026-04-19 15:30:22.694001164 +0000 UTC m=+381.493916700" lastFinishedPulling="2026-04-19 15:30:25.200104959 +0000 UTC m=+384.000020501" observedRunningTime="2026-04-19 15:30:25.902345762 +0000 UTC m=+384.702261322" watchObservedRunningTime="2026-04-19 15:30:25.90275342 +0000 UTC m=+384.702668980"
Apr 19 15:30:33.448990 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.448951 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9"]
Apr 19 15:30:33.458193 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.458167 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9"
Apr 19 15:30:33.460910 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.460879 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 19 15:30:33.460910 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.460900 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 19 15:30:33.461115 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.460905 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-m8zlq\""
Apr 19 15:30:33.467035 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.467005 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9"]
Apr 19 15:30:33.552096 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.552060 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nqqj\" (UniqueName: \"kubernetes.io/projected/18c49a7c-a69a-40a2-8389-97304e717ac0-kube-api-access-5nqqj\") pod \"servicemesh-operator3-55f49c5f94-gnbk9\" (UID: \"18c49a7c-a69a-40a2-8389-97304e717ac0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9"
Apr 19 15:30:33.552262 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.552131 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/18c49a7c-a69a-40a2-8389-97304e717ac0-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gnbk9\" (UID: \"18c49a7c-a69a-40a2-8389-97304e717ac0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9"
Apr 19 15:30:33.652651 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.652596 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/18c49a7c-a69a-40a2-8389-97304e717ac0-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gnbk9\" (UID: \"18c49a7c-a69a-40a2-8389-97304e717ac0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9"
Apr 19 15:30:33.652810 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.652682 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nqqj\" (UniqueName: \"kubernetes.io/projected/18c49a7c-a69a-40a2-8389-97304e717ac0-kube-api-access-5nqqj\") pod \"servicemesh-operator3-55f49c5f94-gnbk9\" (UID: \"18c49a7c-a69a-40a2-8389-97304e717ac0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9"
Apr 19 15:30:33.655185 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.655159 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/18c49a7c-a69a-40a2-8389-97304e717ac0-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gnbk9\" (UID: \"18c49a7c-a69a-40a2-8389-97304e717ac0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9"
Apr 19 15:30:33.665439 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.665413 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nqqj\" (UniqueName: \"kubernetes.io/projected/18c49a7c-a69a-40a2-8389-97304e717ac0-kube-api-access-5nqqj\") pod \"servicemesh-operator3-55f49c5f94-gnbk9\" (UID: \"18c49a7c-a69a-40a2-8389-97304e717ac0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9"
Apr 19 15:30:33.768018 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.767981 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9"
Apr 19 15:30:33.883579 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.883553 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-9vd7p"
Apr 19 15:30:33.895162 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.894766 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9"]
Apr 19 15:30:33.898116 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:30:33.898086 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18c49a7c_a69a_40a2_8389_97304e717ac0.slice/crio-558fd630318665109ed015779d731d7296ccb79fd685c4dc09e58a4dc61dc01b WatchSource:0}: Error finding container 558fd630318665109ed015779d731d7296ccb79fd685c4dc09e58a4dc61dc01b: Status 404 returned error can't find the container with id 558fd630318665109ed015779d731d7296ccb79fd685c4dc09e58a4dc61dc01b
Apr 19 15:30:33.911862 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:33.911838 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9" event={"ID":"18c49a7c-a69a-40a2-8389-97304e717ac0","Type":"ContainerStarted","Data":"558fd630318665109ed015779d731d7296ccb79fd685c4dc09e58a4dc61dc01b"}
Apr 19 15:30:37.926719 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:37.926678 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9" event={"ID":"18c49a7c-a69a-40a2-8389-97304e717ac0","Type":"ContainerStarted","Data":"0b69f4d709ce8c43792a4c3e0e91fc276e194e88e08a6c8b746c2797361cf848"}
Apr 19 15:30:37.927067 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:37.926809 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9"
Apr 19 15:30:37.951613 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:37.950257 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9" podStartSLOduration=1.758870767 podStartE2EDuration="4.950236069s" podCreationTimestamp="2026-04-19 15:30:33 +0000 UTC" firstStartedPulling="2026-04-19 15:30:33.900470054 +0000 UTC m=+392.700385591" lastFinishedPulling="2026-04-19 15:30:37.091835352 +0000 UTC m=+395.891750893" observedRunningTime="2026-04-19 15:30:37.946519875 +0000 UTC m=+396.746435434" watchObservedRunningTime="2026-04-19 15:30:37.950236069 +0000 UTC m=+396.750151628"
Apr 19 15:30:48.579328 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.579291 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"]
Apr 19 15:30:48.582712 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.582689 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.585216 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.585195 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 19 15:30:48.585326 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.585279 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 19 15:30:48.585326 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.585303 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-hwmdr\""
Apr 19 15:30:48.585442 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.585327 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 19 15:30:48.585582 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.585566 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 19 15:30:48.590801 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.590778 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"]
Apr 19 15:30:48.667046 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.667008 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.667218 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.667063 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.667218 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.667138 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.667218 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.667176 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvgmp\" (UniqueName: \"kubernetes.io/projected/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-kube-api-access-jvgmp\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.667319 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.667250 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.667319 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.667307 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.667383 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.667338 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.768361 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.768321 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.768361 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.768367 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.768560 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.768494 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.768560 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.768527 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.768560 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.768553 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvgmp\" (UniqueName: \"kubernetes.io/projected/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-kube-api-access-jvgmp\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.768734 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.768600 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.768734 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.768663 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.769279 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.769227 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.770925 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.770897 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.771049 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.771001 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.771328 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.771306 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.771447 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.771427 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.775816 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.775789 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.776029 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.776011 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvgmp\" (UniqueName: \"kubernetes.io/projected/e7720a6e-57d9-40f6-a6ec-ed93bcc9881e-kube-api-access-jvgmp\") pod \"istiod-openshift-gateway-55ff986f96-j4624\" (UID: \"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.892562 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.892464 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:48.932501 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:48.932472 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnbk9"
Apr 19 15:30:49.026034 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:49.025938 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"]
Apr 19 15:30:49.029286 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:30:49.029253 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7720a6e_57d9_40f6_a6ec_ed93bcc9881e.slice/crio-671e28171f312225313e97bb25b2d0c99b6e2b3d16c39fbe742c07857d5d207a WatchSource:0}: Error finding container 671e28171f312225313e97bb25b2d0c99b6e2b3d16c39fbe742c07857d5d207a: Status 404 returned error can't find the container with id 671e28171f312225313e97bb25b2d0c99b6e2b3d16c39fbe742c07857d5d207a
Apr 19 15:30:49.965257 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:49.965214 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624" event={"ID":"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e","Type":"ContainerStarted","Data":"671e28171f312225313e97bb25b2d0c99b6e2b3d16c39fbe742c07857d5d207a"}
Apr 19 15:30:51.880252 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:51.880198 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 19 15:30:51.880577 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:51.880298 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 19 15:30:52.979361 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:52.979314 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624" event={"ID":"e7720a6e-57d9-40f6-a6ec-ed93bcc9881e","Type":"ContainerStarted","Data":"15f08c499c98ddfe5b62b3982f0fce90a857e8217134f26759f0149366ba0f18"}
Apr 19 15:30:52.979809 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:52.979548 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:52.981171 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:52.981144 2564 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-j4624 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 19 15:30:52.981333 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:52.981196 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624" podUID="e7720a6e-57d9-40f6-a6ec-ed93bcc9881e" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 19 15:30:52.998786 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:52.998732 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624" podStartSLOduration=2.150036588 podStartE2EDuration="4.998718696s" podCreationTimestamp="2026-04-19 15:30:48 +0000 UTC" firstStartedPulling="2026-04-19 15:30:49.031263981 +0000 UTC m=+407.831179518" lastFinishedPulling="2026-04-19 15:30:51.87994609 +0000 UTC m=+410.679861626" observedRunningTime="2026-04-19 15:30:52.99743713 +0000 UTC m=+411.797352692" watchObservedRunningTime="2026-04-19 15:30:52.998718696 +0000 UTC m=+411.798634255"
Apr 19 15:30:53.982685 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:53.982648 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-j4624"
Apr 19 15:30:56.896962 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:30:56.896933 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-4dqqr"
Apr 19 15:31:47.696103 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:47.696071 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw"]
Apr 19 15:31:47.699153 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:47.699137 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw"
Apr 19 15:31:47.701885 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:47.701864 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 19 15:31:47.702013 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:47.701915 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-4629x\""
Apr 19 15:31:47.702916 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:47.702899 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 19 15:31:47.709503 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:47.709479 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw"]
Apr 19 15:31:47.723393 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:47.723368 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8hm6\" (UniqueName: \"kubernetes.io/projected/b74667d0-89d8-4170-8e49-1f645dbfa3da-kube-api-access-k8hm6\") pod \"limitador-operator-controller-manager-85c4996f8c-w2dxw\" (UID: \"b74667d0-89d8-4170-8e49-1f645dbfa3da\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw"
Apr 19 15:31:47.824133 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:47.824103 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8hm6\" (UniqueName: \"kubernetes.io/projected/b74667d0-89d8-4170-8e49-1f645dbfa3da-kube-api-access-k8hm6\") pod \"limitador-operator-controller-manager-85c4996f8c-w2dxw\" (UID: \"b74667d0-89d8-4170-8e49-1f645dbfa3da\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw"
Apr 19 15:31:47.847601 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:47.847536 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8hm6\" (UniqueName: \"kubernetes.io/projected/b74667d0-89d8-4170-8e49-1f645dbfa3da-kube-api-access-k8hm6\") pod \"limitador-operator-controller-manager-85c4996f8c-w2dxw\" (UID: \"b74667d0-89d8-4170-8e49-1f645dbfa3da\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw"
Apr 19 15:31:48.010154 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:48.010123 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw"
Apr 19 15:31:48.157694 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:48.157665 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw"]
Apr 19 15:31:48.162248 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:31:48.162202 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74667d0_89d8_4170_8e49_1f645dbfa3da.slice/crio-e270c932fea5defa51010ae111b02b634e7de62a867053ccfb695532c3fe205a WatchSource:0}: Error finding container e270c932fea5defa51010ae111b02b634e7de62a867053ccfb695532c3fe205a: Status 404 returned error can't find the container with id e270c932fea5defa51010ae111b02b634e7de62a867053ccfb695532c3fe205a
Apr 19 15:31:49.151737 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:49.151693 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw" event={"ID":"b74667d0-89d8-4170-8e49-1f645dbfa3da","Type":"ContainerStarted","Data":"e270c932fea5defa51010ae111b02b634e7de62a867053ccfb695532c3fe205a"}
Apr 19 15:31:51.159439 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:51.159402 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw" event={"ID":"b74667d0-89d8-4170-8e49-1f645dbfa3da","Type":"ContainerStarted","Data":"c5d1ef6c570a963a88bfa09726c5bc86834581b9924656e81561789a8150b782"}
Apr 19 15:31:51.159900 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:51.159526 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw"
Apr 19 15:31:51.176199 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:51.176147 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw" podStartSLOduration=2.172956566 podStartE2EDuration="4.176131238s" podCreationTimestamp="2026-04-19 15:31:47 +0000 UTC" firstStartedPulling="2026-04-19 15:31:48.164164467 +0000 UTC m=+466.964080004" lastFinishedPulling="2026-04-19 15:31:50.167339139 +0000 UTC m=+468.967254676" observedRunningTime="2026-04-19 15:31:51.175371191 +0000 UTC m=+469.975286750" watchObservedRunningTime="2026-04-19 15:31:51.176131238 +0000 UTC m=+469.976046804"
Apr 19 15:31:55.904673 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:55.904618 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-cvwvk"]
Apr 19 15:31:55.908023 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:55.908001 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-cvwvk"
Apr 19 15:31:55.912363 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:55.912341 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-bsmtn\""
Apr 19 15:31:55.917178 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:55.917155 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-cvwvk"]
Apr 19 15:31:55.983884 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:55.983849 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjmqc\" (UniqueName: \"kubernetes.io/projected/6e794a37-05bf-41a9-a8c5-4812048b9ede-kube-api-access-cjmqc\") pod \"authorino-operator-657f44b778-cvwvk\" (UID: \"6e794a37-05bf-41a9-a8c5-4812048b9ede\") " pod="kuadrant-system/authorino-operator-657f44b778-cvwvk"
Apr 19 15:31:56.084889 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:56.084852 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjmqc\" (UniqueName: \"kubernetes.io/projected/6e794a37-05bf-41a9-a8c5-4812048b9ede-kube-api-access-cjmqc\") pod \"authorino-operator-657f44b778-cvwvk\" (UID: \"6e794a37-05bf-41a9-a8c5-4812048b9ede\") " pod="kuadrant-system/authorino-operator-657f44b778-cvwvk"
Apr 19 15:31:56.098436 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:56.098405 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjmqc\" (UniqueName: \"kubernetes.io/projected/6e794a37-05bf-41a9-a8c5-4812048b9ede-kube-api-access-cjmqc\") pod \"authorino-operator-657f44b778-cvwvk\" (UID: \"6e794a37-05bf-41a9-a8c5-4812048b9ede\") " pod="kuadrant-system/authorino-operator-657f44b778-cvwvk"
Apr 19 15:31:56.218887 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:56.218806 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-cvwvk" Apr 19 15:31:56.341905 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:56.341860 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-cvwvk"] Apr 19 15:31:56.345794 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:31:56.345766 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e794a37_05bf_41a9_a8c5_4812048b9ede.slice/crio-f6c144ba0a93d277a2e2c802ceee81efce0bce381b14ec321a567676b17be053 WatchSource:0}: Error finding container f6c144ba0a93d277a2e2c802ceee81efce0bce381b14ec321a567676b17be053: Status 404 returned error can't find the container with id f6c144ba0a93d277a2e2c802ceee81efce0bce381b14ec321a567676b17be053 Apr 19 15:31:57.179465 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:57.179425 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-cvwvk" event={"ID":"6e794a37-05bf-41a9-a8c5-4812048b9ede","Type":"ContainerStarted","Data":"f6c144ba0a93d277a2e2c802ceee81efce0bce381b14ec321a567676b17be053"} Apr 19 15:31:58.183772 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:58.183682 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-cvwvk" event={"ID":"6e794a37-05bf-41a9-a8c5-4812048b9ede","Type":"ContainerStarted","Data":"e4a903920bd6ac45b996947d33ca14ab64037963407a41698a1ede7ecbdc0e80"} Apr 19 15:31:58.183772 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:58.183731 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-cvwvk" Apr 19 15:31:58.203484 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:31:58.203402 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-cvwvk" podStartSLOduration=1.7457658660000002 
podStartE2EDuration="3.203382451s" podCreationTimestamp="2026-04-19 15:31:55 +0000 UTC" firstStartedPulling="2026-04-19 15:31:56.347761521 +0000 UTC m=+475.147677061" lastFinishedPulling="2026-04-19 15:31:57.805378109 +0000 UTC m=+476.605293646" observedRunningTime="2026-04-19 15:31:58.200023257 +0000 UTC m=+476.999938817" watchObservedRunningTime="2026-04-19 15:31:58.203382451 +0000 UTC m=+477.003298013" Apr 19 15:32:02.164251 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:02.164219 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw" Apr 19 15:32:09.189230 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:09.189199 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-cvwvk" Apr 19 15:32:10.933523 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:10.933487 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw"] Apr 19 15:32:10.933925 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:10.933727 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw" podUID="b74667d0-89d8-4170-8e49-1f645dbfa3da" containerName="manager" containerID="cri-o://c5d1ef6c570a963a88bfa09726c5bc86834581b9924656e81561789a8150b782" gracePeriod=2 Apr 19 15:32:10.945045 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:10.945018 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw"] Apr 19 15:32:10.963947 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:10.963919 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rbx6h"] Apr 19 15:32:10.964264 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:10.964246 2564 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b74667d0-89d8-4170-8e49-1f645dbfa3da" containerName="manager" Apr 19 15:32:10.964380 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:10.964268 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74667d0-89d8-4170-8e49-1f645dbfa3da" containerName="manager" Apr 19 15:32:10.964380 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:10.964363 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="b74667d0-89d8-4170-8e49-1f645dbfa3da" containerName="manager" Apr 19 15:32:10.967183 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:10.967160 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rbx6h" Apr 19 15:32:10.979767 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:10.979741 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rbx6h"] Apr 19 15:32:11.004699 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.004667 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj995\" (UniqueName: \"kubernetes.io/projected/33f6475b-3073-4506-a028-7e5bace8df0c-kube-api-access-nj995\") pod \"limitador-operator-controller-manager-85c4996f8c-rbx6h\" (UID: \"33f6475b-3073-4506-a028-7e5bace8df0c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rbx6h" Apr 19 15:32:11.105899 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.105865 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nj995\" (UniqueName: \"kubernetes.io/projected/33f6475b-3073-4506-a028-7e5bace8df0c-kube-api-access-nj995\") pod \"limitador-operator-controller-manager-85c4996f8c-rbx6h\" (UID: \"33f6475b-3073-4506-a028-7e5bace8df0c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rbx6h" Apr 19 
15:32:11.119624 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.119598 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj995\" (UniqueName: \"kubernetes.io/projected/33f6475b-3073-4506-a028-7e5bace8df0c-kube-api-access-nj995\") pod \"limitador-operator-controller-manager-85c4996f8c-rbx6h\" (UID: \"33f6475b-3073-4506-a028-7e5bace8df0c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rbx6h" Apr 19 15:32:11.158875 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.158853 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw" Apr 19 15:32:11.161023 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.160996 2564 status_manager.go:895] "Failed to get status for pod" podUID="b74667d0-89d8-4170-8e49-1f645dbfa3da" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw" err="pods \"limitador-operator-controller-manager-85c4996f8c-w2dxw\" is forbidden: User \"system:node:ip-10-0-131-48.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-48.ec2.internal' and this object" Apr 19 15:32:11.206481 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.206397 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8hm6\" (UniqueName: \"kubernetes.io/projected/b74667d0-89d8-4170-8e49-1f645dbfa3da-kube-api-access-k8hm6\") pod \"b74667d0-89d8-4170-8e49-1f645dbfa3da\" (UID: \"b74667d0-89d8-4170-8e49-1f645dbfa3da\") " Apr 19 15:32:11.208527 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.208499 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b74667d0-89d8-4170-8e49-1f645dbfa3da-kube-api-access-k8hm6" (OuterVolumeSpecName: "kube-api-access-k8hm6") pod "b74667d0-89d8-4170-8e49-1f645dbfa3da" (UID: 
"b74667d0-89d8-4170-8e49-1f645dbfa3da"). InnerVolumeSpecName "kube-api-access-k8hm6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:32:11.226107 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.226077 2564 generic.go:358] "Generic (PLEG): container finished" podID="b74667d0-89d8-4170-8e49-1f645dbfa3da" containerID="c5d1ef6c570a963a88bfa09726c5bc86834581b9924656e81561789a8150b782" exitCode=0 Apr 19 15:32:11.226211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.226122 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw" Apr 19 15:32:11.226211 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.226174 2564 scope.go:117] "RemoveContainer" containerID="c5d1ef6c570a963a88bfa09726c5bc86834581b9924656e81561789a8150b782" Apr 19 15:32:11.229305 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.229280 2564 status_manager.go:895] "Failed to get status for pod" podUID="b74667d0-89d8-4170-8e49-1f645dbfa3da" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw" err="pods \"limitador-operator-controller-manager-85c4996f8c-w2dxw\" is forbidden: User \"system:node:ip-10-0-131-48.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-48.ec2.internal' and this object" Apr 19 15:32:11.233876 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.233857 2564 scope.go:117] "RemoveContainer" containerID="c5d1ef6c570a963a88bfa09726c5bc86834581b9924656e81561789a8150b782" Apr 19 15:32:11.234119 ip-10-0-131-48 kubenswrapper[2564]: E0419 15:32:11.234097 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5d1ef6c570a963a88bfa09726c5bc86834581b9924656e81561789a8150b782\": container with ID starting with c5d1ef6c570a963a88bfa09726c5bc86834581b9924656e81561789a8150b782 not found: ID 
does not exist" containerID="c5d1ef6c570a963a88bfa09726c5bc86834581b9924656e81561789a8150b782" Apr 19 15:32:11.234194 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.234125 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5d1ef6c570a963a88bfa09726c5bc86834581b9924656e81561789a8150b782"} err="failed to get container status \"c5d1ef6c570a963a88bfa09726c5bc86834581b9924656e81561789a8150b782\": rpc error: code = NotFound desc = could not find container \"c5d1ef6c570a963a88bfa09726c5bc86834581b9924656e81561789a8150b782\": container with ID starting with c5d1ef6c570a963a88bfa09726c5bc86834581b9924656e81561789a8150b782 not found: ID does not exist" Apr 19 15:32:11.237022 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.236999 2564 status_manager.go:895] "Failed to get status for pod" podUID="b74667d0-89d8-4170-8e49-1f645dbfa3da" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw" err="pods \"limitador-operator-controller-manager-85c4996f8c-w2dxw\" is forbidden: User \"system:node:ip-10-0-131-48.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-48.ec2.internal' and this object" Apr 19 15:32:11.307529 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.307494 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k8hm6\" (UniqueName: \"kubernetes.io/projected/b74667d0-89d8-4170-8e49-1f645dbfa3da-kube-api-access-k8hm6\") on node \"ip-10-0-131-48.ec2.internal\" DevicePath \"\"" Apr 19 15:32:11.312458 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.312435 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rbx6h" Apr 19 15:32:11.640680 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.640619 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rbx6h"] Apr 19 15:32:11.642812 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:32:11.642787 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f6475b_3073_4506_a028_7e5bace8df0c.slice/crio-b1d6dac613107a3eaf7c0a069a1fd7577cce6fdea4ee33685ccea079d267c492 WatchSource:0}: Error finding container b1d6dac613107a3eaf7c0a069a1fd7577cce6fdea4ee33685ccea079d267c492: Status 404 returned error can't find the container with id b1d6dac613107a3eaf7c0a069a1fd7577cce6fdea4ee33685ccea079d267c492 Apr 19 15:32:11.666448 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.666422 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b74667d0-89d8-4170-8e49-1f645dbfa3da" path="/var/lib/kubelet/pods/b74667d0-89d8-4170-8e49-1f645dbfa3da/volumes" Apr 19 15:32:11.666859 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:11.666824 2564 status_manager.go:895] "Failed to get status for pod" podUID="b74667d0-89d8-4170-8e49-1f645dbfa3da" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-w2dxw" err="pods \"limitador-operator-controller-manager-85c4996f8c-w2dxw\" is forbidden: User \"system:node:ip-10-0-131-48.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-48.ec2.internal' and this object" Apr 19 15:32:12.231474 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:12.231435 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rbx6h" 
event={"ID":"33f6475b-3073-4506-a028-7e5bace8df0c","Type":"ContainerStarted","Data":"1048397df6a8ca2ede53cfc5153bbf896ebfc84413007658ab48f660f3560563"} Apr 19 15:32:12.231474 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:12.231478 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rbx6h" event={"ID":"33f6475b-3073-4506-a028-7e5bace8df0c","Type":"ContainerStarted","Data":"b1d6dac613107a3eaf7c0a069a1fd7577cce6fdea4ee33685ccea079d267c492"} Apr 19 15:32:12.231905 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:12.231671 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rbx6h" Apr 19 15:32:12.258413 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:12.258373 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rbx6h" podStartSLOduration=2.258360766 podStartE2EDuration="2.258360766s" podCreationTimestamp="2026-04-19 15:32:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:32:12.258322103 +0000 UTC m=+491.058237663" watchObservedRunningTime="2026-04-19 15:32:12.258360766 +0000 UTC m=+491.058276325" Apr 19 15:32:23.237445 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:32:23.237411 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rbx6h" Apr 19 15:33:28.191063 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:28.191028 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-44n94"] Apr 19 15:33:28.198670 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:28.198642 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-44n94" Apr 19 15:33:28.202347 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:28.202325 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 19 15:33:28.202475 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:28.202330 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-fbljh\"" Apr 19 15:33:28.208262 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:28.208236 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-44n94"] Apr 19 15:33:28.299238 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:28.299199 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c86b8\" (UniqueName: \"kubernetes.io/projected/f7fd1c88-5ca6-4183-927c-777305a31a89-kube-api-access-c86b8\") pod \"postgres-868db5846d-44n94\" (UID: \"f7fd1c88-5ca6-4183-927c-777305a31a89\") " pod="opendatahub/postgres-868db5846d-44n94" Apr 19 15:33:28.299574 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:28.299249 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f7fd1c88-5ca6-4183-927c-777305a31a89-data\") pod \"postgres-868db5846d-44n94\" (UID: \"f7fd1c88-5ca6-4183-927c-777305a31a89\") " pod="opendatahub/postgres-868db5846d-44n94" Apr 19 15:33:28.399899 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:28.399873 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c86b8\" (UniqueName: \"kubernetes.io/projected/f7fd1c88-5ca6-4183-927c-777305a31a89-kube-api-access-c86b8\") pod \"postgres-868db5846d-44n94\" (UID: \"f7fd1c88-5ca6-4183-927c-777305a31a89\") " pod="opendatahub/postgres-868db5846d-44n94" Apr 19 15:33:28.400044 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:28.399909 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f7fd1c88-5ca6-4183-927c-777305a31a89-data\") pod \"postgres-868db5846d-44n94\" (UID: \"f7fd1c88-5ca6-4183-927c-777305a31a89\") " pod="opendatahub/postgres-868db5846d-44n94" Apr 19 15:33:28.400220 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:28.400206 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f7fd1c88-5ca6-4183-927c-777305a31a89-data\") pod \"postgres-868db5846d-44n94\" (UID: \"f7fd1c88-5ca6-4183-927c-777305a31a89\") " pod="opendatahub/postgres-868db5846d-44n94" Apr 19 15:33:28.408789 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:28.408761 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c86b8\" (UniqueName: \"kubernetes.io/projected/f7fd1c88-5ca6-4183-927c-777305a31a89-kube-api-access-c86b8\") pod \"postgres-868db5846d-44n94\" (UID: \"f7fd1c88-5ca6-4183-927c-777305a31a89\") " pod="opendatahub/postgres-868db5846d-44n94" Apr 19 15:33:28.510709 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:28.510687 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-44n94" Apr 19 15:33:28.628100 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:28.628073 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-44n94"] Apr 19 15:33:28.630537 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:33:28.630506 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7fd1c88_5ca6_4183_927c_777305a31a89.slice/crio-25d90673842e0ed594451a3a58637e55dcfad9805ea93ed74c0205336bf68ecd WatchSource:0}: Error finding container 25d90673842e0ed594451a3a58637e55dcfad9805ea93ed74c0205336bf68ecd: Status 404 returned error can't find the container with id 25d90673842e0ed594451a3a58637e55dcfad9805ea93ed74c0205336bf68ecd Apr 19 15:33:29.478360 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:29.478324 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-44n94" event={"ID":"f7fd1c88-5ca6-4183-927c-777305a31a89","Type":"ContainerStarted","Data":"25d90673842e0ed594451a3a58637e55dcfad9805ea93ed74c0205336bf68ecd"} Apr 19 15:33:34.499721 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:34.499686 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-44n94" event={"ID":"f7fd1c88-5ca6-4183-927c-777305a31a89","Type":"ContainerStarted","Data":"c02345b4c9eaa96e5cfc5dc305ec6f56809b5a48a0e5b5ae65457dc65f1509b0"} Apr 19 15:33:34.500131 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:34.499824 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-44n94" Apr 19 15:33:34.515964 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:34.515922 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-44n94" podStartSLOduration=1.529838114 podStartE2EDuration="6.515908142s" podCreationTimestamp="2026-04-19 15:33:28 +0000 UTC" 
firstStartedPulling="2026-04-19 15:33:28.631757771 +0000 UTC m=+567.431673308" lastFinishedPulling="2026-04-19 15:33:33.617827799 +0000 UTC m=+572.417743336" observedRunningTime="2026-04-19 15:33:34.514519133 +0000 UTC m=+573.314434692" watchObservedRunningTime="2026-04-19 15:33:34.515908142 +0000 UTC m=+573.315823741" Apr 19 15:33:40.530665 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:33:40.530620 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-44n94" Apr 19 15:34:19.281564 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:19.281526 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-66b65d79c8-d5zvh"] Apr 19 15:34:19.284666 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:19.284627 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-66b65d79c8-d5zvh" Apr 19 15:34:19.287184 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:19.287157 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 19 15:34:19.288268 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:19.288227 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-rcvbl\"" Apr 19 15:34:19.288380 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:19.288239 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 19 15:34:19.295385 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:19.295356 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-66b65d79c8-d5zvh"] Apr 19 15:34:19.307969 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:19.307945 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdzs\" (UniqueName: \"kubernetes.io/projected/2b3ec3b4-1395-4fec-87d6-78df803ff3f6-kube-api-access-6wdzs\") pod 
\"maas-api-66b65d79c8-d5zvh\" (UID: \"2b3ec3b4-1395-4fec-87d6-78df803ff3f6\") " pod="opendatahub/maas-api-66b65d79c8-d5zvh" Apr 19 15:34:19.308061 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:19.308001 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2b3ec3b4-1395-4fec-87d6-78df803ff3f6-maas-api-tls\") pod \"maas-api-66b65d79c8-d5zvh\" (UID: \"2b3ec3b4-1395-4fec-87d6-78df803ff3f6\") " pod="opendatahub/maas-api-66b65d79c8-d5zvh" Apr 19 15:34:19.408835 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:19.408798 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdzs\" (UniqueName: \"kubernetes.io/projected/2b3ec3b4-1395-4fec-87d6-78df803ff3f6-kube-api-access-6wdzs\") pod \"maas-api-66b65d79c8-d5zvh\" (UID: \"2b3ec3b4-1395-4fec-87d6-78df803ff3f6\") " pod="opendatahub/maas-api-66b65d79c8-d5zvh" Apr 19 15:34:19.409008 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:19.408867 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2b3ec3b4-1395-4fec-87d6-78df803ff3f6-maas-api-tls\") pod \"maas-api-66b65d79c8-d5zvh\" (UID: \"2b3ec3b4-1395-4fec-87d6-78df803ff3f6\") " pod="opendatahub/maas-api-66b65d79c8-d5zvh" Apr 19 15:34:19.411361 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:19.411336 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2b3ec3b4-1395-4fec-87d6-78df803ff3f6-maas-api-tls\") pod \"maas-api-66b65d79c8-d5zvh\" (UID: \"2b3ec3b4-1395-4fec-87d6-78df803ff3f6\") " pod="opendatahub/maas-api-66b65d79c8-d5zvh" Apr 19 15:34:19.416060 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:19.416030 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdzs\" (UniqueName: 
\"kubernetes.io/projected/2b3ec3b4-1395-4fec-87d6-78df803ff3f6-kube-api-access-6wdzs\") pod \"maas-api-66b65d79c8-d5zvh\" (UID: \"2b3ec3b4-1395-4fec-87d6-78df803ff3f6\") " pod="opendatahub/maas-api-66b65d79c8-d5zvh" Apr 19 15:34:19.596131 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:19.596053 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-66b65d79c8-d5zvh" Apr 19 15:34:19.718377 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:19.718153 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-66b65d79c8-d5zvh"] Apr 19 15:34:19.720814 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:34:19.720786 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b3ec3b4_1395_4fec_87d6_78df803ff3f6.slice/crio-7524e3f1d9ba98bd34d74cf55c5ffa1b96d077707f644360d84b7b45fbf2a21e WatchSource:0}: Error finding container 7524e3f1d9ba98bd34d74cf55c5ffa1b96d077707f644360d84b7b45fbf2a21e: Status 404 returned error can't find the container with id 7524e3f1d9ba98bd34d74cf55c5ffa1b96d077707f644360d84b7b45fbf2a21e Apr 19 15:34:20.657940 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:20.657878 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-66b65d79c8-d5zvh" event={"ID":"2b3ec3b4-1395-4fec-87d6-78df803ff3f6","Type":"ContainerStarted","Data":"7524e3f1d9ba98bd34d74cf55c5ffa1b96d077707f644360d84b7b45fbf2a21e"} Apr 19 15:34:22.666537 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:22.665782 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-66b65d79c8-d5zvh" event={"ID":"2b3ec3b4-1395-4fec-87d6-78df803ff3f6","Type":"ContainerStarted","Data":"b807d3247d43981b74a22e55787d2a9428ad5aaf28499539418325a9395b45b0"} Apr 19 15:34:22.666537 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:22.666505 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="opendatahub/maas-api-66b65d79c8-d5zvh" Apr 19 15:34:22.684809 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:22.684762 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-66b65d79c8-d5zvh" podStartSLOduration=1.442862299 podStartE2EDuration="3.684750689s" podCreationTimestamp="2026-04-19 15:34:19 +0000 UTC" firstStartedPulling="2026-04-19 15:34:19.722043971 +0000 UTC m=+618.521959508" lastFinishedPulling="2026-04-19 15:34:21.96393236 +0000 UTC m=+620.763847898" observedRunningTime="2026-04-19 15:34:22.682617776 +0000 UTC m=+621.482533348" watchObservedRunningTime="2026-04-19 15:34:22.684750689 +0000 UTC m=+621.484666248" Apr 19 15:34:29.679313 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:34:29.679284 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-66b65d79c8-d5zvh" Apr 19 15:58:13.601349 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:13.601308 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-4dqqr_a165459e-29e1-4ee9-bec5-5463ea0f6f21/manager/0.log" Apr 19 15:58:13.710204 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:13.710156 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-66b65d79c8-d5zvh_2b3ec3b4-1395-4fec-87d6-78df803ff3f6/maas-api/0.log" Apr 19 15:58:13.938163 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:13.938092 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-9vd7p_4395055e-f594-437e-8773-9fc927742701/manager/1.log" Apr 19 15:58:14.274081 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:14.274052 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-67944f454b-h7kh2_5e7a403b-d82f-4088-93a8-a4cc60d4be4a/manager/0.log" Apr 19 15:58:14.380841 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:14.380810 2564 log.go:25] "Finished parsing 
log file" path="/var/log/pods/opendatahub_postgres-868db5846d-44n94_f7fd1c88-5ca6-4183-927c-777305a31a89/postgres/0.log" Apr 19 15:58:15.665184 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:15.665155 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-cvwvk_6e794a37-05bf-41a9-a8c5-4812048b9ede/manager/0.log" Apr 19 15:58:16.311371 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:16.311342 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-rbx6h_33f6475b-3073-4506-a028-7e5bace8df0c/manager/0.log" Apr 19 15:58:16.737176 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:16.737101 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-j4624_e7720a6e-57d9-40f6-a6ec-ed93bcc9881e/discovery/0.log" Apr 19 15:58:16.937227 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:16.937202 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-745fb64bc5-kj7bm_95c31735-4f02-453a-97bb-32db7c2850d7/kube-auth-proxy/0.log" Apr 19 15:58:24.433100 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:24.433068 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-p2kvw_16c2a1f0-b512-4202-a476-b96d67bc2fcf/global-pull-secret-syncer/0.log" Apr 19 15:58:24.538079 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:24.538051 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fsfjs_848efa24-3ed5-46b7-b923-74011caa024a/konnectivity-agent/0.log" Apr 19 15:58:24.615874 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:24.615846 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-48.ec2.internal_b98e067c43f6f3381b105d98ca711c33/haproxy/0.log" Apr 19 15:58:28.540176 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:28.540143 2564 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-cvwvk_6e794a37-05bf-41a9-a8c5-4812048b9ede/manager/0.log" Apr 19 15:58:28.762354 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:28.762324 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-rbx6h_33f6475b-3073-4506-a028-7e5bace8df0c/manager/0.log" Apr 19 15:58:30.629178 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:30.629151 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qcbcj_f31b0631-e29f-40f1-9099-bc910d8074ad/node-exporter/0.log" Apr 19 15:58:30.649425 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:30.649404 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qcbcj_f31b0631-e29f-40f1-9099-bc910d8074ad/kube-rbac-proxy/0.log" Apr 19 15:58:30.670924 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:30.670901 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qcbcj_f31b0631-e29f-40f1-9099-bc910d8074ad/init-textfile/0.log" Apr 19 15:58:32.980745 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:32.980706 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr"] Apr 19 15:58:32.983827 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:32.983807 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:32.986450 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:32.986426 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-w8x7s\"/\"openshift-service-ca.crt\"" Apr 19 15:58:32.986668 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:32.986455 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-w8x7s\"/\"kube-root-ca.crt\"" Apr 19 15:58:32.987551 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:32.987532 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-w8x7s\"/\"default-dockercfg-775qz\"" Apr 19 15:58:32.992216 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:32.992184 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr"] Apr 19 15:58:33.121088 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.121051 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce14bf26-e42d-422e-b06f-91720e6eba40-sys\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: \"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.121275 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.121110 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ce14bf26-e42d-422e-b06f-91720e6eba40-podres\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: \"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.121275 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.121136 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-wxx6d\" (UniqueName: \"kubernetes.io/projected/ce14bf26-e42d-422e-b06f-91720e6eba40-kube-api-access-wxx6d\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: \"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.121275 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.121155 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ce14bf26-e42d-422e-b06f-91720e6eba40-proc\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: \"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.121275 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.121184 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce14bf26-e42d-422e-b06f-91720e6eba40-lib-modules\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: \"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.222003 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.221970 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce14bf26-e42d-422e-b06f-91720e6eba40-sys\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: \"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.222166 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.222025 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ce14bf26-e42d-422e-b06f-91720e6eba40-podres\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: \"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" 
Apr 19 15:58:33.222166 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.222108 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce14bf26-e42d-422e-b06f-91720e6eba40-sys\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: \"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.222166 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.222128 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ce14bf26-e42d-422e-b06f-91720e6eba40-podres\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: \"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.222166 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.222141 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxx6d\" (UniqueName: \"kubernetes.io/projected/ce14bf26-e42d-422e-b06f-91720e6eba40-kube-api-access-wxx6d\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: \"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.222346 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.222176 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ce14bf26-e42d-422e-b06f-91720e6eba40-proc\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: \"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.222346 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.222216 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce14bf26-e42d-422e-b06f-91720e6eba40-lib-modules\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: 
\"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.222346 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.222257 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ce14bf26-e42d-422e-b06f-91720e6eba40-proc\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: \"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.222452 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.222361 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce14bf26-e42d-422e-b06f-91720e6eba40-lib-modules\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: \"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.229617 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.229593 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxx6d\" (UniqueName: \"kubernetes.io/projected/ce14bf26-e42d-422e-b06f-91720e6eba40-kube-api-access-wxx6d\") pod \"perf-node-gather-daemonset-tqpsr\" (UID: \"ce14bf26-e42d-422e-b06f-91720e6eba40\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.294362 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.294333 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:33.412013 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.411990 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr"] Apr 19 15:58:33.414274 ip-10-0-131-48 kubenswrapper[2564]: W0419 15:58:33.414252 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podce14bf26_e42d_422e_b06f_91720e6eba40.slice/crio-bca6bb3e42ddee21905a0cef7e8806c8acbbe561ce413e9881aa699b32359237 WatchSource:0}: Error finding container bca6bb3e42ddee21905a0cef7e8806c8acbbe561ce413e9881aa699b32359237: Status 404 returned error can't find the container with id bca6bb3e42ddee21905a0cef7e8806c8acbbe561ce413e9881aa699b32359237 Apr 19 15:58:33.416187 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:33.416162 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 15:58:34.318963 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:34.318928 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" event={"ID":"ce14bf26-e42d-422e-b06f-91720e6eba40","Type":"ContainerStarted","Data":"1939ab7eb070f7e2d1ec9c4f9419fce4a68d6227cd803cd0b6d81a33cca9e441"} Apr 19 15:58:34.318963 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:34.318965 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" event={"ID":"ce14bf26-e42d-422e-b06f-91720e6eba40","Type":"ContainerStarted","Data":"bca6bb3e42ddee21905a0cef7e8806c8acbbe561ce413e9881aa699b32359237"} Apr 19 15:58:34.319393 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:34.319071 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:34.334809 ip-10-0-131-48 kubenswrapper[2564]: I0419 
15:58:34.334763 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" podStartSLOduration=2.334749967 podStartE2EDuration="2.334749967s" podCreationTimestamp="2026-04-19 15:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:58:34.333429073 +0000 UTC m=+2073.133344842" watchObservedRunningTime="2026-04-19 15:58:34.334749967 +0000 UTC m=+2073.134665525" Apr 19 15:58:34.744600 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:34.744525 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-frphc_7577782b-d81a-428d-abba-4b2f85606b5a/dns/0.log" Apr 19 15:58:34.773403 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:34.769111 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-frphc_7577782b-d81a-428d-abba-4b2f85606b5a/kube-rbac-proxy/0.log" Apr 19 15:58:34.898521 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:34.898493 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9wldk_e251a3a4-2359-4dcf-90f8-20b9a43b9aa4/dns-node-resolver/0.log" Apr 19 15:58:35.420941 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:35.420904 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d6jcn_e1bc6add-6e54-4e69-ae6d-573e33e1dc7a/node-ca/0.log" Apr 19 15:58:36.293619 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:36.293588 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-j4624_e7720a6e-57d9-40f6-a6ec-ed93bcc9881e/discovery/0.log" Apr 19 15:58:36.333758 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:36.333733 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-745fb64bc5-kj7bm_95c31735-4f02-453a-97bb-32db7c2850d7/kube-auth-proxy/0.log" 
Apr 19 15:58:36.911445 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:36.911414 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-lnp5x_2f59fe9a-35b7-4dae-ae52-22cef5a86c77/serve-healthcheck-canary/0.log" Apr 19 15:58:37.359165 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:37.359134 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9b56l_02e21f7d-06af-4ca8-b988-b015f529dbf7/kube-rbac-proxy/0.log" Apr 19 15:58:37.378128 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:37.378102 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9b56l_02e21f7d-06af-4ca8-b988-b015f529dbf7/exporter/0.log" Apr 19 15:58:37.397152 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:37.397127 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9b56l_02e21f7d-06af-4ca8-b988-b015f529dbf7/extractor/0.log" Apr 19 15:58:39.313383 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:39.313350 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-4dqqr_a165459e-29e1-4ee9-bec5-5463ea0f6f21/manager/0.log" Apr 19 15:58:39.343070 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:39.343042 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-66b65d79c8-d5zvh_2b3ec3b4-1395-4fec-87d6-78df803ff3f6/maas-api/0.log" Apr 19 15:58:39.409513 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:39.409487 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-9vd7p_4395055e-f594-437e-8773-9fc927742701/manager/0.log" Apr 19 15:58:39.420405 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:39.420379 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-9vd7p_4395055e-f594-437e-8773-9fc927742701/manager/1.log" 
Apr 19 15:58:39.500725 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:39.500694 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-67944f454b-h7kh2_5e7a403b-d82f-4088-93a8-a4cc60d4be4a/manager/0.log" Apr 19 15:58:39.518348 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:39.518323 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-44n94_f7fd1c88-5ca6-4183-927c-777305a31a89/postgres/0.log" Apr 19 15:58:40.331426 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:40.331390 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-tqpsr" Apr 19 15:58:46.380209 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:46.377739 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2jvjq_a265fc76-7ac3-4ce6-b6a9-f1f988b751d5/kube-multus/0.log" Apr 19 15:58:46.399253 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:46.399230 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-57hmj_f4487f66-c637-425c-a304-b53a5a1d6b25/kube-multus-additional-cni-plugins/0.log" Apr 19 15:58:46.419126 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:46.419086 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-57hmj_f4487f66-c637-425c-a304-b53a5a1d6b25/egress-router-binary-copy/0.log" Apr 19 15:58:46.440317 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:46.440292 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-57hmj_f4487f66-c637-425c-a304-b53a5a1d6b25/cni-plugins/0.log" Apr 19 15:58:46.458827 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:46.458802 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-57hmj_f4487f66-c637-425c-a304-b53a5a1d6b25/bond-cni-plugin/0.log" Apr 19 15:58:46.477885 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:46.477862 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-57hmj_f4487f66-c637-425c-a304-b53a5a1d6b25/routeoverride-cni/0.log" Apr 19 15:58:46.496296 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:46.496275 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-57hmj_f4487f66-c637-425c-a304-b53a5a1d6b25/whereabouts-cni-bincopy/0.log" Apr 19 15:58:46.514363 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:46.514345 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-57hmj_f4487f66-c637-425c-a304-b53a5a1d6b25/whereabouts-cni/0.log" Apr 19 15:58:46.881865 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:46.881836 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6gs28_4bc5acd5-37f1-4ca4-bb28-39e2f95194f9/network-metrics-daemon/0.log" Apr 19 15:58:46.900464 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:46.900443 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6gs28_4bc5acd5-37f1-4ca4-bb28-39e2f95194f9/kube-rbac-proxy/0.log" Apr 19 15:58:48.191226 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:48.191202 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwk8b_cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3/ovn-controller/0.log" Apr 19 15:58:48.217002 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:48.216962 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwk8b_cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3/ovn-acl-logging/0.log" Apr 19 15:58:48.235256 ip-10-0-131-48 kubenswrapper[2564]: I0419 
15:58:48.235211 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwk8b_cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3/kube-rbac-proxy-node/0.log" Apr 19 15:58:48.255002 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:48.254971 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwk8b_cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3/kube-rbac-proxy-ovn-metrics/0.log" Apr 19 15:58:48.274207 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:48.274182 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwk8b_cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3/northd/0.log" Apr 19 15:58:48.293126 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:48.293102 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwk8b_cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3/nbdb/0.log" Apr 19 15:58:48.312433 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:48.312413 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwk8b_cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3/sbdb/0.log" Apr 19 15:58:48.406253 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:48.406228 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwk8b_cfa9f4ef-4f5b-4af5-84b7-bfeb2ca1baf3/ovnkube-controller/0.log" Apr 19 15:58:49.750222 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:49.750195 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-f7z6w_f3911b93-0872-4188-b092-01a8b4b8326d/check-endpoints/0.log" Apr 19 15:58:49.774611 ip-10-0-131-48 kubenswrapper[2564]: I0419 15:58:49.774584 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-g9dx5_07623157-4c19-4793-880d-b21867ce44f7/network-check-target-container/0.log"