Apr 16 23:47:15.581874 ip-10-0-133-231 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 23:47:15.581883 ip-10-0-133-231 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 23:47:15.581890 ip-10-0-133-231 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 23:47:15.582153 ip-10-0-133-231 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 23:47:26.731613 ip-10-0-133-231 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 23:47:26.731632 ip-10-0-133-231 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 34b0ae2441004189bb5d9c8114eeeac8 --
Apr 16 23:50:15.338191 ip-10-0-133-231 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 23:50:15.775766 ip-10-0-133-231 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:50:15.775766 ip-10-0-133-231 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 23:50:15.775766 ip-10-0-133-231 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:50:15.775766 ip-10-0-133-231 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 23:50:15.775766 ip-10-0-133-231 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:50:15.777259 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.777176    2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 23:50:15.780283 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780269    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:50:15.780283 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780282    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780287    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780290    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780293    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780295    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780298    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780301    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780305    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780309    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780311    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780314    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780316    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780319    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780322    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780324    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780327    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780329    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780332    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780335    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780337    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:50:15.780346 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780339    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780342    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780345    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780347    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780350    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780355    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780360    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780363    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780366    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780369    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780372    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780375    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780377    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780380    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780382    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780385    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780388    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780392    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780394    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:50:15.780820 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780397    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780400    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780402    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780405    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780407    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780410    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780412    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780415    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780418    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780420    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780423    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780425    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780428    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780430    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780433    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780437    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780439    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780443    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780446    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780449    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:50:15.781306 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780451    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780454    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780457    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780459    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780462    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780465    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780468    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780470    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780473    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780475    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780478    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780487    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780491    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780494    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780496    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780499    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780502    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780504    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780507    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780509    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:50:15.781792 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780512    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780514    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780517    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780519    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780522    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780525    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780904    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780911    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780929    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780932    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780936    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780939    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780941    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780944    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780947    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780950    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780952    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780955    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780958    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780960    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:50:15.782294 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780963    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780966    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780968    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780971    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780974    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780977    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780979    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780982    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780984    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780987    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780989    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780992    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780994    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780997    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.780999    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781001    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781004    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781007    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781009    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781012    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:50:15.782769 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781015    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781018    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781020    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781024    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781026    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781029    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781031    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781034    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781037    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781040    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781043    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781045    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781048    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781050    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781053    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781056    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781059    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781061    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781064    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:50:15.783282 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781066    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781069    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781072    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781074    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781077    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781080    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781083    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781085    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781088    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781090    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781093    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781097    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781101    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781105    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781108    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781111    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781114    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781116    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781119    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:50:15.783735 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781122    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781124    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781127    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781129    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781132    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781134    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781137    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781139    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781142    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781144    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781147    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781150    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781153    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.781155    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781869    2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781877    2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781883    2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781888    2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781892    2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781896    2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781900    2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 23:50:15.784220 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781904    2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781908    2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781911    2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781928    2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781932    2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781935    2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781938    2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781941    2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781944    2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781946    2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781949    2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781952    2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781957    2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781959    2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781962    2576 flags.go:64] FLAG: --config-dir=""
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781965    2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781969    2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781973    2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781975    2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781978    2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781982    2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781986    2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781989    2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781993    2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781996    2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 23:50:15.784827 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.781999    2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782003    2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782006    2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782009    2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782012    2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782015    2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782018    2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782022    2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782025    2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782028    2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782031    2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782033    2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782037    2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782040    2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782043    2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782046    2576 flags.go:64] FLAG: --eviction-soft=""
Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782049    2576 flags.go:64] FLAG:
--eviction-soft-grace-period="" Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782052 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782054 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782057 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782060 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782063 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782066 2576 flags.go:64] FLAG: --feature-gates="" Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782072 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782075 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 23:50:15.785467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782078 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782081 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782085 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782088 2576 flags.go:64] FLAG: --help="false" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782091 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-133-231.ec2.internal" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782095 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 23:50:15.786133 ip-10-0-133-231 
kubenswrapper[2576]: I0416 23:50:15.782098 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782101 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782104 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782108 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782111 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782113 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782116 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782119 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782122 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782125 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782128 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782131 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782133 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782137 2576 flags.go:64] FLAG: 
--kubelet-cgroups="" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782139 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782142 2576 flags.go:64] FLAG: --lock-file="" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782145 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782148 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 23:50:15.786133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782151 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782156 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782159 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782162 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782166 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782169 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782173 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782176 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782179 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782183 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 23:50:15.786698 ip-10-0-133-231 
kubenswrapper[2576]: I0416 23:50:15.782186 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782192 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782195 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782198 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782201 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782204 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782207 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782210 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782213 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782220 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782223 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782225 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782228 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 23:50:15.786698 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782231 2576 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782237 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782240 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782243 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782246 2576 flags.go:64] FLAG: --port="10250" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782249 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782251 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02e956c5d20837653" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782255 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782258 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782260 2576 flags.go:64] FLAG: --register-node="true" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782263 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782266 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782270 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782272 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782276 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 
23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782279 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782283 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782286 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782289 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782292 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782296 2576 flags.go:64] FLAG: --runonce="false" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782299 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782302 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782305 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782309 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782312 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 23:50:15.787313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782315 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782318 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782321 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 
23:50:15.782324 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782327 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782330 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782332 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782335 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782339 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782341 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782347 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782349 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782352 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782356 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782359 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782361 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782364 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782367 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 
23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782370 2576 flags.go:64] FLAG: --v="2" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782374 2576 flags.go:64] FLAG: --version="false" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782379 2576 flags.go:64] FLAG: --vmodule="" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782383 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.782387 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782480 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 23:50:15.788032 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782483 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782486 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782490 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782493 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782496 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782498 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782501 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 23:50:15.788643 ip-10-0-133-231 
kubenswrapper[2576]: W0416 23:50:15.782504 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782508 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782510 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782513 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782516 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782520 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782523 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782526 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782528 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782531 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782533 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782536 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 23:50:15.788643 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782538 2576 feature_gate.go:328] unrecognized 
feature gate: MetricsCollectionProfiles Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782541 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782543 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782546 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782548 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782551 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782553 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782556 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782559 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782562 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782565 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782568 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782570 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 23:50:15.789129 
ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782574 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782576 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782581 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782583 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782586 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782588 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782591 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 23:50:15.789129 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782593 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782596 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782601 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782604 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782606 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782608 2576 feature_gate.go:328] unrecognized 
feature gate: CPMSMachineNamePrefix Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782611 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782613 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782616 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782619 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782621 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782623 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782626 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782628 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782631 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782633 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782636 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782638 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 23:50:15.789632 ip-10-0-133-231 
kubenswrapper[2576]: W0416 23:50:15.782641 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782643 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 23:50:15.789632 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782646 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782650 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782652 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782655 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782658 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782660 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782663 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782667 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782670 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782672 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782675 2576 feature_gate.go:328] unrecognized feature 
gate: MixedCPUsAllocation
Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782678 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782680 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782684 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782686 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782689 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782691 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782694 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782698 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782701 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:50:15.790196 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782704 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:15.790732 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782707 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:50:15.790732 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782710 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:50:15.790732 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782712 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:50:15.790732 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782715 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:15.790732 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.782717 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:50:15.790732 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.783302 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 23:50:15.791715 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.791695 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 23:50:15.791715 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.791714 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791762 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791770 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791775 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791780 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791783 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791786 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791789 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791792 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791795 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791798 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791801 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791803 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791806 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791809 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791811 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:50:15.791805 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791814 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791817 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791820 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791823 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791826 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791828 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791831 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791833 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791836 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791839 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791841 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791844 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791849 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791853 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791856 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791858 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791861 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791864 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791867 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:50:15.792240 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791869 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791872 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791874 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791877 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791879 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791881 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791884 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791886 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791889 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791891 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791894 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791897 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791899 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791902 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791904 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791908 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791911 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791929 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791932 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791935 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:50:15.792703 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791938 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791940 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791943 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791945 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791948 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791951 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791953 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791956 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791958 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791961 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791963 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791966 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791968 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791971 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791974 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791976 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791979 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791981 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791984 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791986 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:50:15.793210 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791989 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791993 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.791998 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792001 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792004 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792007 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792010 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792014 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792017 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792020 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792022 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792025 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.792030 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792129 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792134 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:50:15.793691 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792137 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792140 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792143 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792146 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792148 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792152 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792155 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792158 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792160 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792163 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792165 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792168 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792171 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792174 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792176 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792180 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792185 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792187 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792190 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:50:15.794195 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792193 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792196 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792198 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792201 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792203 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792207 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792209 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792212 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792215 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792217 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792220 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792223 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792225 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792228 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792231 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792233 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792235 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792238 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792241 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792243 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:50:15.794662 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792246 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792248 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792251 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792254 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792257 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792259 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792262 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792264 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792267 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792269 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792272 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792274 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792277 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792279 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792282 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792285 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792287 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792291 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792294 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792296 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:50:15.795161 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792299 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792301 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792304 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792306 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792309 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792311 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792314 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792317 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792320 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792323 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792326 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792330 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792332 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792335 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792338 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792341 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792344 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792346 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792349 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:50:15.795651 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792351 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:50:15.796139 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792354 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:50:15.796139 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792356 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:50:15.796139 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792358 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:50:15.796139 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792361 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:50:15.796139 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:15.792363 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:50:15.796139 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.792368 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 23:50:15.796139 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.793122 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 23:50:15.796637 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.796624 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 23:50:15.797635 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.797624 2576 server.go:1019] "Starting client certificate rotation"
Apr 16 23:50:15.797734 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.797718 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 23:50:15.797769 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.797760 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 23:50:15.821210 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.821192 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 23:50:15.823551 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.823530 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 23:50:15.836747 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.836729 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 16 23:50:15.842509 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.842491 2576 log.go:25] "Validated CRI v1 image API"
Apr 16 23:50:15.843705 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.843686 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 23:50:15.849812 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.849793 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 23:50:15.851038 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.851008 2576 fs.go:135] Filesystem UUIDs: map[50b937b8-bc74-4f3f-895d-15ca0854fae2:/dev/nvme0n1p4 60395717-50d4-4d99-8241-1a9f702d6da3:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 23:50:15.851123 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.851038 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 23:50:15.856611 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.856502 2576 manager.go:217] Machine: {Timestamp:2026-04-16 23:50:15.854682465 +0000 UTC m=+0.400673050 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098329 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec225e1ef8e411ecd5c8495f34e4397a SystemUUID:ec225e1e-f8e4-11ec-d5c8-495f34e4397a BootID:34b0ae24-4100-4189-bb5d-9c8114eeeac8 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:cb:69:b1:0a:39 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:cb:69:b1:0a:39 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:53:d5:aa:bf:a9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 23:50:15.856611 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.856600 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 23:50:15.856771 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.856705 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 23:50:15.857846 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.857823 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 23:50:15.858031 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.857849 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-231.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 23:50:15.858121 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.858045 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 23:50:15.858121 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.858058 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 23:50:15.858121 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.858076 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 23:50:15.858971 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.858959 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 23:50:15.860155 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.860143 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 23:50:15.860285 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.860274 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 23:50:15.862657 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.862646 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 16 23:50:15.862713 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.862664 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 23:50:15.862713 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.862679 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 23:50:15.862713 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.862693 2576 kubelet.go:397] "Adding apiserver pod source" Apr 16 23:50:15.862713 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.862706 2576 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 16 23:50:15.863862 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.863849 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 23:50:15.863954 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.863871 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 23:50:15.866643 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.866628 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 23:50:15.867834 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.867818 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 23:50:15.869554 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.869543 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 23:50:15.869594 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.869560 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 23:50:15.869594 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.869566 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 23:50:15.869594 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.869572 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 23:50:15.869594 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.869580 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 23:50:15.869594 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.869587 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 23:50:15.869594 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.869594 2576 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 16 23:50:15.869781 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.869605 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 23:50:15.869781 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.869612 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 23:50:15.869781 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.869617 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 23:50:15.869781 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.869629 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 23:50:15.869781 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.869638 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 23:50:15.870541 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.870524 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 23:50:15.870541 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.870541 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 23:50:15.873858 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.873842 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 23:50:15.873953 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.873877 2576 server.go:1295] "Started kubelet" Apr 16 23:50:15.874028 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.873979 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 23:50:15.874061 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.874015 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 23:50:15.874089 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.874076 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 23:50:15.874659 ip-10-0-133-231 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 23:50:15.875334 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.875300 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 23:50:15.875558 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.875542 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 23:50:15.875779 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:15.875756 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 23:50:15.875887 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.875868 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-231.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 23:50:15.875992 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:15.875958 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-231.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 23:50:15.880259 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:15.879101 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-231.ec2.internal.18a6fb4e180e5b41 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-231.ec2.internal,UID:ip-10-0-133-231.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-231.ec2.internal,},FirstTimestamp:2026-04-16 23:50:15.873854273 +0000 UTC m=+0.419844867,LastTimestamp:2026-04-16 23:50:15.873854273 +0000 UTC m=+0.419844867,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-231.ec2.internal,}" Apr 16 23:50:15.880390 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.880377 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 23:50:15.881229 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.881170 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 23:50:15.881229 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.881223 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 23:50:15.881372 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.881353 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 23:50:15.881605 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.881583 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 23:50:15.881605 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.881598 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 16 23:50:15.881713 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.880406 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 23:50:15.881767 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.881714 2576 factory.go:55] Registering systemd factory Apr 16 23:50:15.881816 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.881788 2576 factory.go:223] Registration of the systemd container factory successfully Apr 16 23:50:15.881959 ip-10-0-133-231 
kubenswrapper[2576]: E0416 23:50:15.881942 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found" Apr 16 23:50:15.882438 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.882407 2576 factory.go:153] Registering CRI-O factory Apr 16 23:50:15.882438 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.882429 2576 factory.go:223] Registration of the crio container factory successfully Apr 16 23:50:15.882588 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.882479 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 23:50:15.882588 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.882515 2576 factory.go:103] Registering Raw factory Apr 16 23:50:15.882588 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.882530 2576 manager.go:1196] Started watching for new ooms in manager Apr 16 23:50:15.882976 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.882960 2576 manager.go:319] Starting recovery of all containers Apr 16 23:50:15.886161 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:15.886099 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 23:50:15.890575 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.890550 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xk575" Apr 16 23:50:15.895549 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:15.895369 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 23:50:15.895649 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:15.895369 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-231.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 23:50:15.895649 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.895490 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xk575" Apr 16 23:50:15.896289 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.896272 2576 manager.go:324] Recovery completed Apr 16 23:50:15.900785 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.900772 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:15.902968 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.902951 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:15.903027 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.902990 2576 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:15.903027 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.903004 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:15.903455 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.903440 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 23:50:15.903455 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.903454 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 23:50:15.903577 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.903472 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 23:50:15.905583 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.905565 2576 policy_none.go:49] "None policy: Start" Apr 16 23:50:15.905646 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.905592 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 23:50:15.905646 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.905631 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 23:50:15.939496 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.939482 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 23:50:15.945953 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:15.939512 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 23:50:15.945953 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.939522 2576 server.go:85] "Starting device plugin registration server" Apr 16 23:50:15.945953 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.939766 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 23:50:15.945953 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.939779 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 
16 23:50:15.945953 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.939884 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 23:50:15.945953 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.940049 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 23:50:15.945953 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.940058 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 23:50:15.945953 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:15.940632 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 23:50:15.945953 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:15.940674 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-231.ec2.internal\" not found" Apr 16 23:50:15.974668 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.974637 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 23:50:15.975999 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.975976 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 23:50:15.976082 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.976008 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 23:50:15.976082 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.976028 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 23:50:15.976082 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.976036 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 23:50:15.976082 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:15.976073 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 23:50:15.978760 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:15.978742 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:50:16.040867 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.040819 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:16.041586 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.041566 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:16.041676 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.041601 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:16.041676 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.041611 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:16.041676 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.041633 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-231.ec2.internal" Apr 16 23:50:16.050718 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.050700 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-231.ec2.internal" Apr 16 23:50:16.050797 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:16.050720 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-231.ec2.internal\": node \"ip-10-0-133-231.ec2.internal\" not found" Apr 16 
23:50:16.067515 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:16.067494 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found" Apr 16 23:50:16.076545 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.076523 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal"] Apr 16 23:50:16.076619 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.076593 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:16.077803 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.077786 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:16.077882 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.077815 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:16.077882 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.077826 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:16.078967 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.078953 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:16.079101 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.079086 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal" Apr 16 23:50:16.079159 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.079119 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:16.079658 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.079640 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:16.079720 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.079667 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:16.079720 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.079643 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:16.079720 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.079691 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:16.079720 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.079709 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:16.079859 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.079725 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:16.080636 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.080620 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal" Apr 16 23:50:16.080709 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.080644 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:16.081295 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.081279 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:16.081368 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.081310 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:16.081368 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.081324 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:16.082820 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.082800 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23282ac0c4062a2415d09037c9c74b84-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal\" (UID: \"23282ac0c4062a2415d09037c9c74b84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal" Apr 16 23:50:16.082911 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.082827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1eb73e8eaa1b833503296b19a264c17c-config\") pod \"kube-apiserver-proxy-ip-10-0-133-231.ec2.internal\" (UID: \"1eb73e8eaa1b833503296b19a264c17c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal" Apr 16 23:50:16.082911 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.082870 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/23282ac0c4062a2415d09037c9c74b84-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal\" (UID: \"23282ac0c4062a2415d09037c9c74b84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal" Apr 16 23:50:16.102596 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:16.102574 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-231.ec2.internal\" not found" node="ip-10-0-133-231.ec2.internal" Apr 16 23:50:16.107054 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:16.107037 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-231.ec2.internal\" not found" node="ip-10-0-133-231.ec2.internal" Apr 16 23:50:16.168167 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:16.168147 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found" Apr 16 23:50:16.183465 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.183443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/23282ac0c4062a2415d09037c9c74b84-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal\" (UID: \"23282ac0c4062a2415d09037c9c74b84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal" Apr 16 23:50:16.183465 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.183460 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/23282ac0c4062a2415d09037c9c74b84-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal\" (UID: \"23282ac0c4062a2415d09037c9c74b84\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal"
Apr 16 23:50:16.183596 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.183486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23282ac0c4062a2415d09037c9c74b84-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal\" (UID: \"23282ac0c4062a2415d09037c9c74b84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal"
Apr 16 23:50:16.183596 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.183514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1eb73e8eaa1b833503296b19a264c17c-config\") pod \"kube-apiserver-proxy-ip-10-0-133-231.ec2.internal\" (UID: \"1eb73e8eaa1b833503296b19a264c17c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal"
Apr 16 23:50:16.183596 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.183559 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1eb73e8eaa1b833503296b19a264c17c-config\") pod \"kube-apiserver-proxy-ip-10-0-133-231.ec2.internal\" (UID: \"1eb73e8eaa1b833503296b19a264c17c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal"
Apr 16 23:50:16.183596 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.183579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23282ac0c4062a2415d09037c9c74b84-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal\" (UID: \"23282ac0c4062a2415d09037c9c74b84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal"
Apr 16 23:50:16.268590 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:16.268560 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 16 23:50:16.369374 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:16.369314 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 16 23:50:16.404805 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.404775 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal"
Apr 16 23:50:16.409546 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.409530 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal"
Apr 16 23:50:16.469904 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:16.469880 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 16 23:50:16.570374 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:16.570351 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 16 23:50:16.670900 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:16.670843 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 16 23:50:16.710728 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.710705 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 23:50:16.771681 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:16.771660 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 16 23:50:16.797198 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.797173 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 23:50:16.797662 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.797333 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 23:50:16.797662 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.797337 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 23:50:16.820652 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.820629 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 23:50:16.872542 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:16.872512 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 16 23:50:16.882428 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.882362 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 23:50:16.893441 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.893422 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 23:50:16.898139 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.898112 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 23:45:15 +0000 UTC" deadline="2027-12-22 05:38:46.340296079 +0000 UTC"
Apr 16 23:50:16.898139 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.898138 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14741h48m29.442160666s"
Apr 16 23:50:16.914439 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.914418 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vdl2c"
Apr 16 23:50:16.921638 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.921600 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vdl2c"
Apr 16 23:50:16.960117 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:16.960087 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23282ac0c4062a2415d09037c9c74b84.slice/crio-672e6444406092335418fd395f4e7dd0e77364dcdadebc85e8965fdcc83ec970 WatchSource:0}: Error finding container 672e6444406092335418fd395f4e7dd0e77364dcdadebc85e8965fdcc83ec970: Status 404 returned error can't find the container with id 672e6444406092335418fd395f4e7dd0e77364dcdadebc85e8965fdcc83ec970
Apr 16 23:50:16.960450 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:16.960431 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1eb73e8eaa1b833503296b19a264c17c.slice/crio-ec4a57863de64856753d4ee054238a0f0b42fc76464d24d81b7932c4ee517277 WatchSource:0}: Error finding container ec4a57863de64856753d4ee054238a0f0b42fc76464d24d81b7932c4ee517277: Status 404 returned error can't find the container with id ec4a57863de64856753d4ee054238a0f0b42fc76464d24d81b7932c4ee517277
Apr 16 23:50:16.965903 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.965880 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 23:50:16.973216 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:16.973199 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 16 23:50:16.979329 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.979290 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal" event={"ID":"1eb73e8eaa1b833503296b19a264c17c","Type":"ContainerStarted","Data":"ec4a57863de64856753d4ee054238a0f0b42fc76464d24d81b7932c4ee517277"}
Apr 16 23:50:16.980413 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:16.980395 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal" event={"ID":"23282ac0c4062a2415d09037c9c74b84","Type":"ContainerStarted","Data":"672e6444406092335418fd395f4e7dd0e77364dcdadebc85e8965fdcc83ec970"}
Apr 16 23:50:17.073958 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:17.073933 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 16 23:50:17.174409 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:17.174359 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 16 23:50:17.274876 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:17.274850 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 16 23:50:17.375635 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:17.375609 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 16 23:50:17.439436 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.439383 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 23:50:17.482024 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.481997 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal"
Apr 16 23:50:17.491392 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.491370 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 23:50:17.492327 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.492306 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal"
Apr 16 23:50:17.501700 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.501681 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 23:50:17.834595 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.834520 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 23:50:17.863756 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.863718 2576 apiserver.go:52] "Watching apiserver"
Apr 16 23:50:17.871548 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.871522 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 23:50:17.874524 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.874491 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-68t8w","openshift-dns/node-resolver-747kz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal","openshift-multus/network-metrics-daemon-ktsl6","openshift-network-diagnostics/network-check-target-qgk96","openshift-network-operator/iptables-alerter-5xhvf","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp","openshift-image-registry/node-ca-4ldzm","openshift-multus/multus-additional-cni-plugins-x5gfp","openshift-multus/multus-dtpng","openshift-ovn-kubernetes/ovnkube-node-btfdz","kube-system/konnectivity-agent-qcsbq","kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal"]
Apr 16 23:50:17.877796 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.877774 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp"
Apr 16 23:50:17.880103 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.879822 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 23:50:17.880103 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.879825 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 23:50:17.880103 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.879857 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-747kz"
Apr 16 23:50:17.880103 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.879950 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 23:50:17.880103 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.879964 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zf2rz\""
Apr 16 23:50:17.882944 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.881606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-gj2zl\""
Apr 16 23:50:17.882944 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.881706 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 23:50:17.882944 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.881843 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 23:50:17.884263 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.884243 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96"
Apr 16 23:50:17.884378 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:17.884312 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670"
Apr 16 23:50:17.886646 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.886472 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6"
Apr 16 23:50:17.886646 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:17.886564 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7"
Apr 16 23:50:17.888716 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.888686 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5xhvf"
Apr 16 23:50:17.890595 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.890577 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 23:50:17.890975 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.890960 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 23:50:17.891060 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.890961 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:17.891114 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.890964 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lnb8h\""
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.891752 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.891773 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jggl\" (UniqueName: \"kubernetes.io/projected/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-kube-api-access-4jggl\") pod \"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6"
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.891811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-registration-dir\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp"
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.891849 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-etc-selinux\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp"
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.891882 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qjdk\" (UniqueName: \"kubernetes.io/projected/33e105a8-68bd-410a-93cc-61f702f80d3e-kube-api-access-7qjdk\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp"
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.891940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d193d29-372b-44a9-a007-2f9fd389e08e-hosts-file\") pod \"node-resolver-747kz\" (UID: \"3d193d29-372b-44a9-a007-2f9fd389e08e\") " pod="openshift-dns/node-resolver-747kz"
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.892424 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4ldzm"
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.892545 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d193d29-372b-44a9-a007-2f9fd389e08e-tmp-dir\") pod \"node-resolver-747kz\" (UID: \"3d193d29-372b-44a9-a007-2f9fd389e08e\") " pod="openshift-dns/node-resolver-747kz"
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.892671 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs\") pod \"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6"
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.892702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8tlq\" (UniqueName: \"kubernetes.io/projected/3d193d29-372b-44a9-a007-2f9fd389e08e-kube-api-access-f8tlq\") pod \"node-resolver-747kz\" (UID: \"3d193d29-372b-44a9-a007-2f9fd389e08e\") " pod="openshift-dns/node-resolver-747kz"
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.892728 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp"
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.892759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-socket-dir\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp"
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.892788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-device-dir\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp"
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.892819 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-sys-fs\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp"
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.893496 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.894005 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pz5vg\""
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.894036 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.894192 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 23:50:17.894888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.894420 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-9zcd5\""
Apr 16 23:50:17.895802 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.895171 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 23:50:17.895906 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.895838 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x5gfp"
Apr 16 23:50:17.896244 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.896027 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 23:50:17.898120 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.897956 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fj65d\""
Apr 16 23:50:17.898444 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.898425 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dtpng"
Apr 16 23:50:17.899033 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.899012 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 23:50:17.899130 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.899051 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 23:50:17.899210 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.899012 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 23:50:17.899262 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.899253 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 23:50:17.899416 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.899400 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 23:50:17.899897 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.899877 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 23:50:17.900316 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.900297 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-46v7t\""
Apr 16 23:50:17.900968 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.900951 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:17.902970 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.902898 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 23:50:17.903169 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.903149 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tc9gp\""
Apr 16 23:50:17.903273 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.903246 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 23:50:17.903334 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.903246 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 23:50:17.903334 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.903297 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qcsbq"
Apr 16 23:50:17.903495 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.903482 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 23:50:17.903760 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.903743 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 23:50:17.903895 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.903881 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 23:50:17.905141 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.905124 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 23:50:17.905229 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.905172 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 23:50:17.905373 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.905352 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wtjp4\""
Apr 16 23:50:17.922565 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.922543 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 23:45:16 +0000 UTC" deadline="2028-01-27 12:49:08.084152622 +0000 UTC"
Apr 16 23:50:17.922565 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.922565 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15612h58m50.161590351s"
Apr 16 23:50:17.982254 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.982234 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 23:50:17.993479 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qjdk\" (UniqueName: \"kubernetes.io/projected/33e105a8-68bd-410a-93cc-61f702f80d3e-kube-api-access-7qjdk\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp"
Apr 16 23:50:17.993597 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d193d29-372b-44a9-a007-2f9fd389e08e-hosts-file\") pod \"node-resolver-747kz\" (UID: \"3d193d29-372b-44a9-a007-2f9fd389e08e\") " pod="openshift-dns/node-resolver-747kz"
Apr 16 23:50:17.993597 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993508 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/71c22abd-5fe1-440f-8a1c-d7fd92526d8f-agent-certs\") pod \"konnectivity-agent-qcsbq\" (UID: \"71c22abd-5fe1-440f-8a1c-d7fd92526d8f\") " pod="kube-system/konnectivity-agent-qcsbq"
Apr 16 23:50:17.993597 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993532 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-modprobe-d\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:17.993597 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993554 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-system-cni-dir\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:17.993597 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-run-netns\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:17.993597 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993585 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d193d29-372b-44a9-a007-2f9fd389e08e-hosts-file\") pod \"node-resolver-747kz\" (UID: \"3d193d29-372b-44a9-a007-2f9fd389e08e\") " pod="openshift-dns/node-resolver-747kz"
Apr 16 23:50:17.993885 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993602 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8tlq\" (UniqueName: \"kubernetes.io/projected/3d193d29-372b-44a9-a007-2f9fd389e08e-kube-api-access-f8tlq\") pod \"node-resolver-747kz\" (UID: \"3d193d29-372b-44a9-a007-2f9fd389e08e\") " pod="openshift-dns/node-resolver-747kz"
Apr 16 23:50:17.993885 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993629 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/71c22abd-5fe1-440f-8a1c-d7fd92526d8f-konnectivity-ca\") pod \"konnectivity-agent-qcsbq\" (UID: \"71c22abd-5fe1-440f-8a1c-d7fd92526d8f\") " pod="kube-system/konnectivity-agent-qcsbq"
Apr 16 23:50:17.993885 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993645 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-sysconfig\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:17.993885 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993662 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-systemd\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:17.993885 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-system-cni-dir\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp"
Apr 16 23:50:17.993885 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-etc-kubernetes\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:17.993885 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-sysctl-conf\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:17.993885 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp"
Apr 16 23:50:17.993885 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kwrt\" (UniqueName: \"kubernetes.io/projected/63883006-764d-4455-b7c7-6289c17bdd27-kube-api-access-5kwrt\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:17.993885 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-run-netns\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:17.993885 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-etc-openvswitch\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993903 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63883006-764d-4455-b7c7-6289c17bdd27-cni-binary-copy\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993948 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-multus-socket-dir-parent\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.993975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994000 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-systemd-units\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-log-socket\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-run-ovn-kubernetes\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994077 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-run\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-var-lib-kubelet\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994138 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-run-systemd\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-run-ovn\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994224 2576
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-socket-dir\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-device-dir\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-sys-fs\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-cni-binary-copy\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp" Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: 
\"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp" Apr 16 23:50:17.994458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-run-multus-certs\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:17.995313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-socket-dir\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" Apr 16 23:50:17.995313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-device-dir\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" Apr 16 23:50:17.995313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-slash\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:17.995313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994462 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-sys-fs\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" Apr 16 23:50:17.995313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994847 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-var-lib-openvswitch\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:17.995313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994871 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-kubernetes\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:17.995313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-sys\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:17.995313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994910 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-tuned\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:17.995313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.994946 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp" Apr 16 23:50:17.995313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.995309 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzkfx\" (UniqueName: \"kubernetes.io/projected/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-kube-api-access-lzkfx\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp" Apr 16 23:50:17.995644 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.995350 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-multus-cni-dir\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:17.995644 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.995380 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-cni-netd\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:17.995644 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.995408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f39171be-e0ce-40eb-86b2-8d51c766008b-ovnkube-script-lib\") pod 
\"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:17.995644 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.995437 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-lib-modules\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:17.995644 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.995460 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae9169bb-ec87-4908-a951-5b40a1ba2267-tmp\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:17.995644 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.995519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-node-log\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:17.995644 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.995550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs\") pod \"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:17.995644 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.995597 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/a6c8c77b-23d8-444b-93d9-efd10d5f4f5b-serviceca\") pod \"node-ca-4ldzm\" (UID: \"a6c8c77b-23d8-444b-93d9-efd10d5f4f5b\") " pod="openshift-image-registry/node-ca-4ldzm" Apr 16 23:50:17.995851 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:17.995729 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:17.995851 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.995778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js6bw\" (UniqueName: \"kubernetes.io/projected/f39171be-e0ce-40eb-86b2-8d51c766008b-kube-api-access-js6bw\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:17.995910 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:17.995857 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs podName:c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:18.495790131 +0000 UTC m=+3.041780708 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs") pod "network-metrics-daemon-ktsl6" (UID: "c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:17.995985 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.995937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/069cfdd1-867c-43a9-bb51-9a876f085e59-iptables-alerter-script\") pod \"iptables-alerter-5xhvf\" (UID: \"069cfdd1-867c-43a9-bb51-9a876f085e59\") " pod="openshift-network-operator/iptables-alerter-5xhvf" Apr 16 23:50:17.996019 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996005 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6c8c77b-23d8-444b-93d9-efd10d5f4f5b-host\") pod \"node-ca-4ldzm\" (UID: \"a6c8c77b-23d8-444b-93d9-efd10d5f4f5b\") " pod="openshift-image-registry/node-ca-4ldzm" Apr 16 23:50:17.996051 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996040 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-hostroot\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:17.996090 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-registration-dir\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" Apr 16 23:50:17.996120 ip-10-0-133-231 
kubenswrapper[2576]: I0416 23:50:17.996112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-etc-selinux\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" Apr 16 23:50:17.996178 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-registration-dir\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" Apr 16 23:50:17.996216 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clp8z\" (UniqueName: \"kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z\") pod \"network-check-target-qgk96\" (UID: \"74530a0f-97e2-4c5b-a142-39f048b22670\") " pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:17.996249 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5tvk\" (UniqueName: \"kubernetes.io/projected/069cfdd1-867c-43a9-bb51-9a876f085e59-kube-api-access-f5tvk\") pod \"iptables-alerter-5xhvf\" (UID: \"069cfdd1-867c-43a9-bb51-9a876f085e59\") " pod="openshift-network-operator/iptables-alerter-5xhvf" Apr 16 23:50:17.996299 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvpfl\" (UniqueName: 
\"kubernetes.io/projected/a6c8c77b-23d8-444b-93d9-efd10d5f4f5b-kube-api-access-qvpfl\") pod \"node-ca-4ldzm\" (UID: \"a6c8c77b-23d8-444b-93d9-efd10d5f4f5b\") " pod="openshift-image-registry/node-ca-4ldzm" Apr 16 23:50:17.996370 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996345 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-sysctl-d\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:17.996403 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996384 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-os-release\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:17.996434 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996415 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d193d29-372b-44a9-a007-2f9fd389e08e-tmp-dir\") pod \"node-resolver-747kz\" (UID: \"3d193d29-372b-44a9-a007-2f9fd389e08e\") " pod="openshift-dns/node-resolver-747kz" Apr 16 23:50:17.996467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-run-k8s-cni-cncf-io\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:17.996498 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-var-lib-cni-multus\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:17.996553 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-multus-conf-dir\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:17.996600 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996568 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-run-openvswitch\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:17.996647 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-cni-bin\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:17.996707 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f39171be-e0ce-40eb-86b2-8d51c766008b-ovnkube-config\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:17.996765 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996726 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/069cfdd1-867c-43a9-bb51-9a876f085e59-host-slash\") pod \"iptables-alerter-5xhvf\" (UID: \"069cfdd1-867c-43a9-bb51-9a876f085e59\") " pod="openshift-network-operator/iptables-alerter-5xhvf" Apr 16 23:50:17.996795 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996784 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-var-lib-cni-bin\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:17.996855 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996829 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f39171be-e0ce-40eb-86b2-8d51c766008b-env-overrides\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:17.996887 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" Apr 16 23:50:17.996931 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-host\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" 
Apr 16 23:50:17.996981 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.996954 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntjks\" (UniqueName: \"kubernetes.io/projected/ae9169bb-ec87-4908-a951-5b40a1ba2267-kube-api-access-ntjks\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:17.997043 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.997001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-os-release\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp" Apr 16 23:50:17.997043 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.997028 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-cnibin\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:17.997105 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.997068 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f39171be-e0ce-40eb-86b2-8d51c766008b-ovn-node-metrics-cert\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:17.997138 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.997102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jggl\" (UniqueName: \"kubernetes.io/projected/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-kube-api-access-4jggl\") pod 
\"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:17.997169 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.997134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-cnibin\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp" Apr 16 23:50:17.997199 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.997179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-var-lib-kubelet\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:17.997237 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.997223 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/63883006-764d-4455-b7c7-6289c17bdd27-multus-daemon-config\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:17.997269 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.997238 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-etc-selinux\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" Apr 16 23:50:17.997300 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.997250 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-kubelet\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:17.997685 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.997668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d193d29-372b-44a9-a007-2f9fd389e08e-tmp-dir\") pod \"node-resolver-747kz\" (UID: \"3d193d29-372b-44a9-a007-2f9fd389e08e\") " pod="openshift-dns/node-resolver-747kz" Apr 16 23:50:17.997764 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.997670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33e105a8-68bd-410a-93cc-61f702f80d3e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" Apr 16 23:50:17.999090 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:17.999062 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 23:50:18.002886 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.002864 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qjdk\" (UniqueName: \"kubernetes.io/projected/33e105a8-68bd-410a-93cc-61f702f80d3e-kube-api-access-7qjdk\") pod \"aws-ebs-csi-driver-node-rzrfp\" (UID: \"33e105a8-68bd-410a-93cc-61f702f80d3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" Apr 16 23:50:18.002992 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.002873 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8tlq\" (UniqueName: \"kubernetes.io/projected/3d193d29-372b-44a9-a007-2f9fd389e08e-kube-api-access-f8tlq\") pod \"node-resolver-747kz\" (UID: \"3d193d29-372b-44a9-a007-2f9fd389e08e\") " pod="openshift-dns/node-resolver-747kz" Apr 16 23:50:18.012182 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.012160 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jggl\" (UniqueName: \"kubernetes.io/projected/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-kube-api-access-4jggl\") pod \"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:18.098967 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.098869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp" Apr 16 23:50:18.098967 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.098910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-run-multus-certs\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.098967 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.098949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-slash\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.099176 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.098973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-var-lib-openvswitch\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.099176 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.098996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-kubernetes\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.099176 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099004 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-run-multus-certs\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.099176 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-sys\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.099176 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-tuned\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.099176 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp" Apr 16 23:50:18.099176 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099065 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-var-lib-openvswitch\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.099176 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099086 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp" Apr 16 23:50:18.099176 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099087 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzkfx\" (UniqueName: \"kubernetes.io/projected/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-kube-api-access-lzkfx\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp" Apr 16 23:50:18.099176 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-slash\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.099176 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-multus-cni-dir\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.099176 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-cni-netd\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.099176 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-sys\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.099176 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099182 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f39171be-e0ce-40eb-86b2-8d51c766008b-ovnkube-script-lib\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099215 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-kubernetes\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-lib-modules\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099244 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae9169bb-ec87-4908-a951-5b40a1ba2267-tmp\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-node-log\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099302 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a6c8c77b-23d8-444b-93d9-efd10d5f4f5b-serviceca\") pod \"node-ca-4ldzm\" (UID: \"a6c8c77b-23d8-444b-93d9-efd10d5f4f5b\") " pod="openshift-image-registry/node-ca-4ldzm" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099337 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-js6bw\" (UniqueName: \"kubernetes.io/projected/f39171be-e0ce-40eb-86b2-8d51c766008b-kube-api-access-js6bw\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/069cfdd1-867c-43a9-bb51-9a876f085e59-iptables-alerter-script\") pod \"iptables-alerter-5xhvf\" (UID: \"069cfdd1-867c-43a9-bb51-9a876f085e59\") " pod="openshift-network-operator/iptables-alerter-5xhvf" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6c8c77b-23d8-444b-93d9-efd10d5f4f5b-host\") pod \"node-ca-4ldzm\" (UID: \"a6c8c77b-23d8-444b-93d9-efd10d5f4f5b\") " pod="openshift-image-registry/node-ca-4ldzm" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-hostroot\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099464 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clp8z\" (UniqueName: \"kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z\") pod \"network-check-target-qgk96\" (UID: \"74530a0f-97e2-4c5b-a142-39f048b22670\") " pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5tvk\" (UniqueName: \"kubernetes.io/projected/069cfdd1-867c-43a9-bb51-9a876f085e59-kube-api-access-f5tvk\") pod \"iptables-alerter-5xhvf\" (UID: \"069cfdd1-867c-43a9-bb51-9a876f085e59\") " pod="openshift-network-operator/iptables-alerter-5xhvf" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099518 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvpfl\" (UniqueName: \"kubernetes.io/projected/a6c8c77b-23d8-444b-93d9-efd10d5f4f5b-kube-api-access-qvpfl\") pod \"node-ca-4ldzm\" (UID: \"a6c8c77b-23d8-444b-93d9-efd10d5f4f5b\") " pod="openshift-image-registry/node-ca-4ldzm" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-sysctl-d\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-node-log\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.099736 
ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-os-release\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.099736 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099707 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-multus-cni-dir\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.100418 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099745 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-cni-netd\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.100418 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp" Apr 16 23:50:18.100418 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100005 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-lib-modules\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.100418 ip-10-0-133-231 
kubenswrapper[2576]: I0416 23:50:18.100174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-hostroot\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.100418 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100282 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/069cfdd1-867c-43a9-bb51-9a876f085e59-iptables-alerter-script\") pod \"iptables-alerter-5xhvf\" (UID: \"069cfdd1-867c-43a9-bb51-9a876f085e59\") " pod="openshift-network-operator/iptables-alerter-5xhvf" Apr 16 23:50:18.100418 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-sysctl-d\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.100418 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.099580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-os-release\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.100418 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6c8c77b-23d8-444b-93d9-efd10d5f4f5b-host\") pod \"node-ca-4ldzm\" (UID: \"a6c8c77b-23d8-444b-93d9-efd10d5f4f5b\") " pod="openshift-image-registry/node-ca-4ldzm" Apr 16 23:50:18.100418 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100347 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-run-k8s-cni-cncf-io\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.100418 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-run-k8s-cni-cncf-io\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.100418 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-var-lib-cni-multus\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-multus-conf-dir\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100454 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-run-openvswitch\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100501 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-cni-bin\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100508 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-multus-conf-dir\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f39171be-e0ce-40eb-86b2-8d51c766008b-ovnkube-config\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-run-openvswitch\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/069cfdd1-867c-43a9-bb51-9a876f085e59-host-slash\") pod \"iptables-alerter-5xhvf\" (UID: \"069cfdd1-867c-43a9-bb51-9a876f085e59\") " pod="openshift-network-operator/iptables-alerter-5xhvf" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100573 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a6c8c77b-23d8-444b-93d9-efd10d5f4f5b-serviceca\") pod \"node-ca-4ldzm\" (UID: \"a6c8c77b-23d8-444b-93d9-efd10d5f4f5b\") " pod="openshift-image-registry/node-ca-4ldzm" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-var-lib-cni-multus\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100602 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-cni-bin\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100616 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-var-lib-cni-bin\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f39171be-e0ce-40eb-86b2-8d51c766008b-env-overrides\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-host\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/069cfdd1-867c-43a9-bb51-9a876f085e59-host-slash\") pod \"iptables-alerter-5xhvf\" (UID: \"069cfdd1-867c-43a9-bb51-9a876f085e59\") " pod="openshift-network-operator/iptables-alerter-5xhvf" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntjks\" (UniqueName: \"kubernetes.io/projected/ae9169bb-ec87-4908-a951-5b40a1ba2267-kube-api-access-ntjks\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-os-release\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-cnibin\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.100844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100775 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f39171be-e0ce-40eb-86b2-8d51c766008b-ovn-node-metrics-cert\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f39171be-e0ce-40eb-86b2-8d51c766008b-ovnkube-script-lib\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100797 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-cnibin\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-var-lib-kubelet\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100847 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-var-lib-cni-bin\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100850 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/63883006-764d-4455-b7c7-6289c17bdd27-multus-daemon-config\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-kubelet\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100942 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/71c22abd-5fe1-440f-8a1c-d7fd92526d8f-agent-certs\") pod \"konnectivity-agent-qcsbq\" (UID: \"71c22abd-5fe1-440f-8a1c-d7fd92526d8f\") " pod="kube-system/konnectivity-agent-qcsbq" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-modprobe-d\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.100994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-system-cni-dir\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-run-netns\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/71c22abd-5fe1-440f-8a1c-d7fd92526d8f-konnectivity-ca\") pod \"konnectivity-agent-qcsbq\" (UID: \"71c22abd-5fe1-440f-8a1c-d7fd92526d8f\") " pod="kube-system/konnectivity-agent-qcsbq" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f39171be-e0ce-40eb-86b2-8d51c766008b-env-overrides\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-sysconfig\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-systemd\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w" Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-run-netns\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-system-cni-dir\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp"
Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101154 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-modprobe-d\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:18.101679 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-system-cni-dir\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f39171be-e0ce-40eb-86b2-8d51c766008b-ovnkube-config\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-etc-kubernetes\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101194 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-kubelet\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101206 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-sysctl-conf\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101260 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-sysconfig\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kwrt\" (UniqueName: \"kubernetes.io/projected/63883006-764d-4455-b7c7-6289c17bdd27-kube-api-access-5kwrt\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-run-netns\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-host\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-etc-openvswitch\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101380 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-etc-openvswitch\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-run-netns\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-cnibin\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101501 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-host-var-lib-kubelet\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63883006-764d-4455-b7c7-6289c17bdd27-cni-binary-copy\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101536 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-systemd\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101547 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-system-cni-dir\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp"
Apr 16 23:50:18.102420 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-multus-socket-dir-parent\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101585 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-etc-kubernetes\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101600 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-multus-socket-dir-parent\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-systemd-units\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-log-socket\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-run-ovn-kubernetes\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-os-release\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101708 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-sysctl-conf\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-run\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-var-lib-kubelet\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-systemd-units\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-run-systemd\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101795 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-run-ovn\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101802 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-cni-binary-copy\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101854 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-log-socket\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.104463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101863 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63883006-764d-4455-b7c7-6289c17bdd27-cnibin\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:18.104902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101929 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-run-ovn\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.104902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-run\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:18.104902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101978 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae9169bb-ec87-4908-a951-5b40a1ba2267-var-lib-kubelet\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:18.104902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.101997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-run-systemd\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.104902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.102021 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f39171be-e0ce-40eb-86b2-8d51c766008b-host-run-ovn-kubernetes\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.104902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.102079 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63883006-764d-4455-b7c7-6289c17bdd27-cni-binary-copy\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:18.104902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.102167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp"
Apr 16 23:50:18.104902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.102393 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-cni-binary-copy\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp"
Apr 16 23:50:18.104902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.102474 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/63883006-764d-4455-b7c7-6289c17bdd27-multus-daemon-config\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:18.104902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.102722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/71c22abd-5fe1-440f-8a1c-d7fd92526d8f-konnectivity-ca\") pod \"konnectivity-agent-qcsbq\" (UID: \"71c22abd-5fe1-440f-8a1c-d7fd92526d8f\") " pod="kube-system/konnectivity-agent-qcsbq"
Apr 16 23:50:18.104902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.103120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ae9169bb-ec87-4908-a951-5b40a1ba2267-etc-tuned\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:18.104902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.103493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae9169bb-ec87-4908-a951-5b40a1ba2267-tmp\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:18.104902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.104248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f39171be-e0ce-40eb-86b2-8d51c766008b-ovn-node-metrics-cert\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.104902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.104299 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/71c22abd-5fe1-440f-8a1c-d7fd92526d8f-agent-certs\") pod \"konnectivity-agent-qcsbq\" (UID: \"71c22abd-5fe1-440f-8a1c-d7fd92526d8f\") " pod="kube-system/konnectivity-agent-qcsbq"
Apr 16 23:50:18.106983 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:18.105286 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 23:50:18.106983 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:18.105305 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 23:50:18.106983 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:18.105317 2576 projected.go:194] Error preparing data for projected volume kube-api-access-clp8z for pod openshift-network-diagnostics/network-check-target-qgk96: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:18.106983 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:18.105377 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z podName:74530a0f-97e2-4c5b-a142-39f048b22670 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:18.605359388 +0000 UTC m=+3.151349965 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-clp8z" (UniqueName: "kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z") pod "network-check-target-qgk96" (UID: "74530a0f-97e2-4c5b-a142-39f048b22670") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:18.108274 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.108253 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzkfx\" (UniqueName: \"kubernetes.io/projected/4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc-kube-api-access-lzkfx\") pod \"multus-additional-cni-plugins-x5gfp\" (UID: \"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc\") " pod="openshift-multus/multus-additional-cni-plugins-x5gfp"
Apr 16 23:50:18.108379 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.108281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5tvk\" (UniqueName: \"kubernetes.io/projected/069cfdd1-867c-43a9-bb51-9a876f085e59-kube-api-access-f5tvk\") pod \"iptables-alerter-5xhvf\" (UID: \"069cfdd1-867c-43a9-bb51-9a876f085e59\") " pod="openshift-network-operator/iptables-alerter-5xhvf"
Apr 16 23:50:18.108379 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.108351 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvpfl\" (UniqueName: \"kubernetes.io/projected/a6c8c77b-23d8-444b-93d9-efd10d5f4f5b-kube-api-access-qvpfl\") pod \"node-ca-4ldzm\" (UID: \"a6c8c77b-23d8-444b-93d9-efd10d5f4f5b\") " pod="openshift-image-registry/node-ca-4ldzm"
Apr 16 23:50:18.108570 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.108544 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-js6bw\" (UniqueName: \"kubernetes.io/projected/f39171be-e0ce-40eb-86b2-8d51c766008b-kube-api-access-js6bw\") pod \"ovnkube-node-btfdz\" (UID: \"f39171be-e0ce-40eb-86b2-8d51c766008b\") " pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.108984 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.108963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kwrt\" (UniqueName: \"kubernetes.io/projected/63883006-764d-4455-b7c7-6289c17bdd27-kube-api-access-5kwrt\") pod \"multus-dtpng\" (UID: \"63883006-764d-4455-b7c7-6289c17bdd27\") " pod="openshift-multus/multus-dtpng"
Apr 16 23:50:18.109271 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.109250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntjks\" (UniqueName: \"kubernetes.io/projected/ae9169bb-ec87-4908-a951-5b40a1ba2267-kube-api-access-ntjks\") pod \"tuned-68t8w\" (UID: \"ae9169bb-ec87-4908-a951-5b40a1ba2267\") " pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:18.190722 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.190694 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp"
Apr 16 23:50:18.199422 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.199398 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-747kz"
Apr 16 23:50:18.208082 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.208061 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5xhvf"
Apr 16 23:50:18.213675 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.213649 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-68t8w"
Apr 16 23:50:18.220190 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.220172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4ldzm"
Apr 16 23:50:18.227763 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.227743 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x5gfp"
Apr 16 23:50:18.234393 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.234367 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dtpng"
Apr 16 23:50:18.240967 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.240947 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:18.246503 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.246485 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qcsbq"
Apr 16 23:50:18.505141 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.505060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs\") pod \"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6"
Apr 16 23:50:18.505297 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:18.505172 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:50:18.505297 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:18.505231 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs podName:c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:19.505213445 +0000 UTC m=+4.051204011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs") pod "network-metrics-daemon-ktsl6" (UID: "c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:50:18.569745 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.569718 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 23:50:18.605427 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.605403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clp8z\" (UniqueName: \"kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z\") pod \"network-check-target-qgk96\" (UID: \"74530a0f-97e2-4c5b-a142-39f048b22670\") " pod="openshift-network-diagnostics/network-check-target-qgk96"
Apr 16 23:50:18.605589 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:18.605528 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 23:50:18.605589 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:18.605545 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 23:50:18.605589 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:18.605556 2576 projected.go:194] Error preparing data for projected volume kube-api-access-clp8z for pod openshift-network-diagnostics/network-check-target-qgk96: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:18.605751 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:18.605602 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z podName:74530a0f-97e2-4c5b-a142-39f048b22670 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:19.605589775 +0000 UTC m=+4.151580335 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-clp8z" (UniqueName: "kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z") pod "network-check-target-qgk96" (UID: "74530a0f-97e2-4c5b-a142-39f048b22670") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:18.729194 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:18.729166 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae9169bb_ec87_4908_a951_5b40a1ba2267.slice/crio-f06f78e18d40d5267a2e4baec86877c765d5309f43b9250d7140e5585790cb1b WatchSource:0}: Error finding container f06f78e18d40d5267a2e4baec86877c765d5309f43b9250d7140e5585790cb1b: Status 404 returned error can't find the container with id f06f78e18d40d5267a2e4baec86877c765d5309f43b9250d7140e5585790cb1b
Apr 16 23:50:18.730893 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:18.730845 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6c8c77b_23d8_444b_93d9_efd10d5f4f5b.slice/crio-3dad3d36cf4175764c5f1d0d15d1c135bcd3d268abf2e454a33c112e07d5e8b6 WatchSource:0}: Error finding container 3dad3d36cf4175764c5f1d0d15d1c135bcd3d268abf2e454a33c112e07d5e8b6: Status 404 returned error can't find the container with id 3dad3d36cf4175764c5f1d0d15d1c135bcd3d268abf2e454a33c112e07d5e8b6
Apr 16 23:50:18.735824 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:18.735801 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71c22abd_5fe1_440f_8a1c_d7fd92526d8f.slice/crio-8d7b3c2c49eda0744c14dab0566edfa1569fe371cee533689e36f984b27dda0c WatchSource:0}: Error finding container 8d7b3c2c49eda0744c14dab0566edfa1569fe371cee533689e36f984b27dda0c: Status 404 returned error can't find the container with id 8d7b3c2c49eda0744c14dab0566edfa1569fe371cee533689e36f984b27dda0c
Apr 16 23:50:18.736474 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:18.736455 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4af2a1c3_1ef3_407b_b7ab_2cdd03b858cc.slice/crio-bad9c55c23e6151cd05f3acb7d16eea43228bd71300a2d64381bd3daf8570846 WatchSource:0}: Error finding container bad9c55c23e6151cd05f3acb7d16eea43228bd71300a2d64381bd3daf8570846: Status 404 returned error can't find the container with id bad9c55c23e6151cd05f3acb7d16eea43228bd71300a2d64381bd3daf8570846
Apr 16 23:50:18.737878 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:18.737839 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d193d29_372b_44a9_a007_2f9fd389e08e.slice/crio-dea48ee02238e9c9fe401cb21d8cc6954c1ce474ebdc3dc5e0f21227337682ff WatchSource:0}: Error finding container dea48ee02238e9c9fe401cb21d8cc6954c1ce474ebdc3dc5e0f21227337682ff: Status 404 returned error can't find the container with id dea48ee02238e9c9fe401cb21d8cc6954c1ce474ebdc3dc5e0f21227337682ff
Apr 16 23:50:18.738940 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:18.738783 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63883006_764d_4455_b7c7_6289c17bdd27.slice/crio-233880aba30bc258b248ac60d7e095be76d7701095eef18044afab28ce25e65f WatchSource:0}: Error finding container 233880aba30bc258b248ac60d7e095be76d7701095eef18044afab28ce25e65f: Status 404 returned error can't find the container with id 233880aba30bc258b248ac60d7e095be76d7701095eef18044afab28ce25e65f
Apr 16 23:50:18.740052 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:18.740031 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf39171be_e0ce_40eb_86b2_8d51c766008b.slice/crio-aeaa9b6535d1ca37a61fb27a6be917cdbf546867ef855bb41698eda7c26781d7 WatchSource:0}: Error finding container aeaa9b6535d1ca37a61fb27a6be917cdbf546867ef855bb41698eda7c26781d7: Status 404 returned error can't find the container with id aeaa9b6535d1ca37a61fb27a6be917cdbf546867ef855bb41698eda7c26781d7
Apr 16 23:50:18.740665 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:18.740644 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33e105a8_68bd_410a_93cc_61f702f80d3e.slice/crio-2c44018770ca80e46acb42e2a476b9c8b4dca1e898600462215f128c31e84103 WatchSource:0}: Error finding container 2c44018770ca80e46acb42e2a476b9c8b4dca1e898600462215f128c31e84103: Status 404 returned error can't find the container with id 2c44018770ca80e46acb42e2a476b9c8b4dca1e898600462215f128c31e84103
Apr 16 23:50:18.742105 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:18.742081 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod069cfdd1_867c_43a9_bb51_9a876f085e59.slice/crio-29d34649f61a9295c51a4a37efd355caaf25374f77e232669115566950ecf967 WatchSource:0}: Error finding container 29d34649f61a9295c51a4a37efd355caaf25374f77e232669115566950ecf967: Status 404 returned error can't find the container with id 29d34649f61a9295c51a4a37efd355caaf25374f77e232669115566950ecf967
Apr 16 23:50:18.923803 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.923608 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 23:45:16 +0000 UTC" deadline="2028-01-03 05:46:09.180226466 +0000 UTC"
Apr 16 23:50:18.923803 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.923801 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15029h55m50.256429926s"
Apr 16 23:50:18.977056 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.977029 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96"
Apr 16 23:50:18.977190 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:18.977151 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670"
Apr 16 23:50:18.983911 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.983884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" event={"ID":"33e105a8-68bd-410a-93cc-61f702f80d3e","Type":"ContainerStarted","Data":"2c44018770ca80e46acb42e2a476b9c8b4dca1e898600462215f128c31e84103"}
Apr 16 23:50:18.985875 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.985851 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" event={"ID":"f39171be-e0ce-40eb-86b2-8d51c766008b","Type":"ContainerStarted","Data":"aeaa9b6535d1ca37a61fb27a6be917cdbf546867ef855bb41698eda7c26781d7"}
Apr 16 23:50:18.986818 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.986783 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qcsbq" event={"ID":"71c22abd-5fe1-440f-8a1c-d7fd92526d8f","Type":"ContainerStarted","Data":"8d7b3c2c49eda0744c14dab0566edfa1569fe371cee533689e36f984b27dda0c"}
Apr 16 23:50:18.987626 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.987604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4ldzm" event={"ID":"a6c8c77b-23d8-444b-93d9-efd10d5f4f5b","Type":"ContainerStarted","Data":"3dad3d36cf4175764c5f1d0d15d1c135bcd3d268abf2e454a33c112e07d5e8b6"}
Apr 16 23:50:18.988754 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.988706 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-68t8w" event={"ID":"ae9169bb-ec87-4908-a951-5b40a1ba2267","Type":"ContainerStarted","Data":"f06f78e18d40d5267a2e4baec86877c765d5309f43b9250d7140e5585790cb1b"}
Apr 16 23:50:18.990091 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.990072 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal" event={"ID":"1eb73e8eaa1b833503296b19a264c17c","Type":"ContainerStarted","Data":"7a960db5003b6741524bb6e39073d62281990cb1d591456ce47d9b74e61cac86"}
Apr 16 23:50:18.991138 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.991116 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5xhvf" event={"ID":"069cfdd1-867c-43a9-bb51-9a876f085e59","Type":"ContainerStarted","Data":"29d34649f61a9295c51a4a37efd355caaf25374f77e232669115566950ecf967"}
Apr 16 23:50:18.992102 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.992085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5gfp" event={"ID":"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc","Type":"ContainerStarted","Data":"bad9c55c23e6151cd05f3acb7d16eea43228bd71300a2d64381bd3daf8570846"}
Apr 16 23:50:18.993017 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.992995 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-dtpng" event={"ID":"63883006-764d-4455-b7c7-6289c17bdd27","Type":"ContainerStarted","Data":"233880aba30bc258b248ac60d7e095be76d7701095eef18044afab28ce25e65f"} Apr 16 23:50:18.993990 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:18.993968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-747kz" event={"ID":"3d193d29-372b-44a9-a007-2f9fd389e08e","Type":"ContainerStarted","Data":"dea48ee02238e9c9fe401cb21d8cc6954c1ce474ebdc3dc5e0f21227337682ff"} Apr 16 23:50:19.001146 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:19.001102 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal" podStartSLOduration=2.001091466 podStartE2EDuration="2.001091466s" podCreationTimestamp="2026-04-16 23:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:50:19.000909711 +0000 UTC m=+3.546900313" watchObservedRunningTime="2026-04-16 23:50:19.001091466 +0000 UTC m=+3.547082049" Apr 16 23:50:19.511066 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:19.510985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs\") pod \"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:19.511234 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:19.511146 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:19.511304 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:19.511235 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs 
podName:c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:21.51121377 +0000 UTC m=+6.057204342 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs") pod "network-metrics-daemon-ktsl6" (UID: "c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:19.611531 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:19.611493 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clp8z\" (UniqueName: \"kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z\") pod \"network-check-target-qgk96\" (UID: \"74530a0f-97e2-4c5b-a142-39f048b22670\") " pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:19.611693 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:19.611664 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:50:19.611693 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:19.611683 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:50:19.611816 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:19.611696 2576 projected.go:194] Error preparing data for projected volume kube-api-access-clp8z for pod openshift-network-diagnostics/network-check-target-qgk96: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:19.611816 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:19.611752 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z podName:74530a0f-97e2-4c5b-a142-39f048b22670 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:21.611734463 +0000 UTC m=+6.157725027 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-clp8z" (UniqueName: "kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z") pod "network-check-target-qgk96" (UID: "74530a0f-97e2-4c5b-a142-39f048b22670") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:19.977639 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:19.977023 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:19.977639 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:19.977166 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7" Apr 16 23:50:20.007195 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:20.007084 2576 generic.go:358] "Generic (PLEG): container finished" podID="23282ac0c4062a2415d09037c9c74b84" containerID="6c6c3b3dec59b0463a03b4502070617835bb005f7a41aeaa8df24b66e3ef6bc3" exitCode=0 Apr 16 23:50:20.007195 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:20.007164 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal" event={"ID":"23282ac0c4062a2415d09037c9c74b84","Type":"ContainerDied","Data":"6c6c3b3dec59b0463a03b4502070617835bb005f7a41aeaa8df24b66e3ef6bc3"} Apr 16 23:50:20.014627 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:20.014603 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-tnq5s"] Apr 16 23:50:20.017519 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:20.017498 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:20.017618 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:20.017578 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358" Apr 16 23:50:20.115051 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:20.114998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c517f342-1f92-4fea-88bf-76e1e2f71358-dbus\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:20.115211 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:20.115104 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c517f342-1f92-4fea-88bf-76e1e2f71358-kubelet-config\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:20.115211 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:20.115135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:20.216161 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:20.216124 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c517f342-1f92-4fea-88bf-76e1e2f71358-kubelet-config\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:20.216327 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:20.216181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:20.216327 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:20.216227 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c517f342-1f92-4fea-88bf-76e1e2f71358-dbus\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:20.216461 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:20.216442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c517f342-1f92-4fea-88bf-76e1e2f71358-dbus\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:20.216530 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:20.216515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c517f342-1f92-4fea-88bf-76e1e2f71358-kubelet-config\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:20.216626 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:20.216611 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 23:50:20.216678 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:20.216672 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret podName:c517f342-1f92-4fea-88bf-76e1e2f71358 nodeName:}" failed. 
No retries permitted until 2026-04-16 23:50:20.716652709 +0000 UTC m=+5.262643268 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret") pod "global-pull-secret-syncer-tnq5s" (UID: "c517f342-1f92-4fea-88bf-76e1e2f71358") : object "kube-system"/"original-pull-secret" not registered Apr 16 23:50:20.720490 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:20.720408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:20.720696 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:20.720598 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 23:50:20.720696 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:20.720654 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret podName:c517f342-1f92-4fea-88bf-76e1e2f71358 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:21.720637327 +0000 UTC m=+6.266627890 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret") pod "global-pull-secret-syncer-tnq5s" (UID: "c517f342-1f92-4fea-88bf-76e1e2f71358") : object "kube-system"/"original-pull-secret" not registered Apr 16 23:50:20.976743 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:20.976664 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:20.976945 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:20.976794 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670" Apr 16 23:50:21.024168 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:21.023501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal" event={"ID":"23282ac0c4062a2415d09037c9c74b84","Type":"ContainerStarted","Data":"335f6300664180db47b07451907d762ccca06a5b2bb1feed460b92d155308774"} Apr 16 23:50:21.035675 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:21.035631 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal" podStartSLOduration=4.035618759 podStartE2EDuration="4.035618759s" podCreationTimestamp="2026-04-16 23:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:50:21.035286749 +0000 UTC m=+5.581277333" watchObservedRunningTime="2026-04-16 23:50:21.035618759 +0000 UTC m=+5.581609342" Apr 16 23:50:21.527843 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:21.527237 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs\") pod \"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 
23:50:21.527843 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:21.527398 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:21.527843 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:21.527460 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs podName:c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:25.52744032 +0000 UTC m=+10.073430883 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs") pod "network-metrics-daemon-ktsl6" (UID: "c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:21.629042 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:21.628349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clp8z\" (UniqueName: \"kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z\") pod \"network-check-target-qgk96\" (UID: \"74530a0f-97e2-4c5b-a142-39f048b22670\") " pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:21.629042 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:21.628525 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:50:21.629042 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:21.628544 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:50:21.629042 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:21.628557 2576 projected.go:194] Error preparing data for projected 
volume kube-api-access-clp8z for pod openshift-network-diagnostics/network-check-target-qgk96: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:21.629042 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:21.628644 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z podName:74530a0f-97e2-4c5b-a142-39f048b22670 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:25.628625014 +0000 UTC m=+10.174615598 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-clp8z" (UniqueName: "kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z") pod "network-check-target-qgk96" (UID: "74530a0f-97e2-4c5b-a142-39f048b22670") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:21.728926 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:21.728883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:21.729113 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:21.729048 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 23:50:21.729179 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:21.729116 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret podName:c517f342-1f92-4fea-88bf-76e1e2f71358 nodeName:}" failed. 
No retries permitted until 2026-04-16 23:50:23.729096182 +0000 UTC m=+8.275086752 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret") pod "global-pull-secret-syncer-tnq5s" (UID: "c517f342-1f92-4fea-88bf-76e1e2f71358") : object "kube-system"/"original-pull-secret" not registered Apr 16 23:50:21.978329 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:21.978258 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:21.978504 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:21.978400 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358" Apr 16 23:50:21.978842 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:21.978820 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:21.979084 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:21.979061 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7" Apr 16 23:50:22.976697 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:22.976663 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:22.977163 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:22.976789 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670" Apr 16 23:50:23.746612 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:23.746562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:23.746781 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:23.746719 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 23:50:23.746850 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:23.746784 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret podName:c517f342-1f92-4fea-88bf-76e1e2f71358 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:27.746766634 +0000 UTC m=+12.292757197 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret") pod "global-pull-secret-syncer-tnq5s" (UID: "c517f342-1f92-4fea-88bf-76e1e2f71358") : object "kube-system"/"original-pull-secret" not registered Apr 16 23:50:23.977745 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:23.977076 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:23.977745 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:23.977219 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358" Apr 16 23:50:23.977745 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:23.977602 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:23.977745 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:23.977696 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7" Apr 16 23:50:24.977054 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:24.976607 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:24.977054 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:24.976727 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670" Apr 16 23:50:25.560348 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:25.560269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs\") pod \"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:25.560755 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:25.560439 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:25.560755 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:25.560513 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs podName:c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:33.560493183 +0000 UTC m=+18.106483748 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs") pod "network-metrics-daemon-ktsl6" (UID: "c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:25.662819 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:25.661619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clp8z\" (UniqueName: \"kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z\") pod \"network-check-target-qgk96\" (UID: \"74530a0f-97e2-4c5b-a142-39f048b22670\") " pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:25.662819 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:25.661804 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:50:25.662819 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:25.661832 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:50:25.662819 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:25.661844 2576 projected.go:194] Error preparing data for projected volume kube-api-access-clp8z for pod openshift-network-diagnostics/network-check-target-qgk96: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:25.662819 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:25.661905 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z podName:74530a0f-97e2-4c5b-a142-39f048b22670 nodeName:}" failed. 
No retries permitted until 2026-04-16 23:50:33.661886283 +0000 UTC m=+18.207876866 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-clp8z" (UniqueName: "kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z") pod "network-check-target-qgk96" (UID: "74530a0f-97e2-4c5b-a142-39f048b22670") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:25.977313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:25.977234 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:25.977465 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:25.977344 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358" Apr 16 23:50:25.978016 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:25.977839 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:25.978016 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:25.977966 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7" Apr 16 23:50:26.976334 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:26.976300 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:26.976785 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:26.976420 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670" Apr 16 23:50:27.777890 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:27.777856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:27.778097 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:27.778040 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 23:50:27.778165 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:27.778113 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret podName:c517f342-1f92-4fea-88bf-76e1e2f71358 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:35.778093022 +0000 UTC m=+20.324083585 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret") pod "global-pull-secret-syncer-tnq5s" (UID: "c517f342-1f92-4fea-88bf-76e1e2f71358") : object "kube-system"/"original-pull-secret" not registered Apr 16 23:50:27.977091 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:27.977058 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:27.977572 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:27.977058 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:27.977572 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:27.977194 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7" Apr 16 23:50:27.977572 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:27.977258 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358" Apr 16 23:50:28.976644 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:28.976610 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:28.976795 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:28.976724 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670" Apr 16 23:50:29.976778 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:29.976737 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:29.977228 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:29.976869 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358" Apr 16 23:50:29.977228 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:29.976949 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:29.977228 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:29.977076 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7" Apr 16 23:50:30.976571 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:30.976544 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:30.976738 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:30.976660 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670" Apr 16 23:50:31.977067 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:31.977032 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:31.977489 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:31.977156 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358" Apr 16 23:50:31.977692 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:31.977663 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:31.977810 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:31.977788 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7" Apr 16 23:50:32.977081 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:32.977046 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:32.977485 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:32.977178 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670" Apr 16 23:50:33.621662 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:33.621625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs\") pod \"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:33.621844 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:33.621760 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:33.621844 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:33.621835 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs podName:c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:49.621816243 +0000 UTC m=+34.167806826 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs") pod "network-metrics-daemon-ktsl6" (UID: "c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:33.722434 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:33.722386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clp8z\" (UniqueName: \"kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z\") pod \"network-check-target-qgk96\" (UID: \"74530a0f-97e2-4c5b-a142-39f048b22670\") " pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:33.722588 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:33.722565 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:50:33.722588 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:33.722584 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:50:33.722667 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:33.722595 2576 projected.go:194] Error preparing data for projected volume kube-api-access-clp8z for pod openshift-network-diagnostics/network-check-target-qgk96: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:33.722667 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:33.722655 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z podName:74530a0f-97e2-4c5b-a142-39f048b22670 nodeName:}" failed. 
No retries permitted until 2026-04-16 23:50:49.722640921 +0000 UTC m=+34.268631481 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-clp8z" (UniqueName: "kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z") pod "network-check-target-qgk96" (UID: "74530a0f-97e2-4c5b-a142-39f048b22670") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:33.976288 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:33.976212 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:33.976288 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:33.976241 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:33.976462 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:33.976322 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358" Apr 16 23:50:33.976462 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:33.976436 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7" Apr 16 23:50:34.976461 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:34.976427 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:34.976896 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:34.976543 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670" Apr 16 23:50:35.837243 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:35.837217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:35.837369 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:35.837337 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 23:50:35.837411 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:35.837381 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret podName:c517f342-1f92-4fea-88bf-76e1e2f71358 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:51.83736768 +0000 UTC m=+36.383358240 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret") pod "global-pull-secret-syncer-tnq5s" (UID: "c517f342-1f92-4fea-88bf-76e1e2f71358") : object "kube-system"/"original-pull-secret" not registered Apr 16 23:50:35.977673 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:35.977153 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:35.977673 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:35.977256 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358" Apr 16 23:50:35.977673 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:35.977307 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:35.977673 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:35.977383 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7" Apr 16 23:50:36.977029 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:36.976673 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:36.977160 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:36.977128 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670" Apr 16 23:50:37.059461 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.059432 2576 generic.go:358] "Generic (PLEG): container finished" podID="4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc" containerID="ae520643df7c4114085f643553766fbcfa21e8f1cb772323ff95bccc52835345" exitCode=0 Apr 16 23:50:37.060213 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.059508 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5gfp" event={"ID":"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc","Type":"ContainerDied","Data":"ae520643df7c4114085f643553766fbcfa21e8f1cb772323ff95bccc52835345"} Apr 16 23:50:37.060935 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.060891 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dtpng" event={"ID":"63883006-764d-4455-b7c7-6289c17bdd27","Type":"ContainerStarted","Data":"6ea9942a6ee98be6128fd531db9b7e8e37cbbd0ef35bb165445f9a6041b4381e"} Apr 16 23:50:37.062263 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.062235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-747kz" event={"ID":"3d193d29-372b-44a9-a007-2f9fd389e08e","Type":"ContainerStarted","Data":"452491362bffd5fea3aacef6cb6475c915b305d9cc67df51485db013a956e5b3"} Apr 16 23:50:37.063581 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.063541 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" event={"ID":"33e105a8-68bd-410a-93cc-61f702f80d3e","Type":"ContainerStarted","Data":"c039f82594eeabbe07c47c1531356912840c9b434a7f15c8bd81c359233f37cc"} Apr 16 23:50:37.065738 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.065719 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log" Apr 16 23:50:37.066040 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.066021 2576 generic.go:358] "Generic (PLEG): container finished" podID="f39171be-e0ce-40eb-86b2-8d51c766008b" containerID="719c594af347d12811a13b17ec8304aabee67984ab5d8cfcb3a82704190bf9d3" exitCode=1 Apr 16 23:50:37.066130 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.066077 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" event={"ID":"f39171be-e0ce-40eb-86b2-8d51c766008b","Type":"ContainerStarted","Data":"e11359785adad9a704914978611c7843230578eb7658da681c1793d0d996b33f"} Apr 16 23:50:37.066130 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.066093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" event={"ID":"f39171be-e0ce-40eb-86b2-8d51c766008b","Type":"ContainerStarted","Data":"03631699930c5988efad81d631131227738c28afb2136c4707a9c90fe3c313c2"} Apr 16 23:50:37.066130 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.066101 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" event={"ID":"f39171be-e0ce-40eb-86b2-8d51c766008b","Type":"ContainerStarted","Data":"d48732bc7d270bd1a6ec6729326132adf9e33f6d2debb3a0b69b62106fb9dfc4"} Apr 16 23:50:37.066130 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.066121 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" 
event={"ID":"f39171be-e0ce-40eb-86b2-8d51c766008b","Type":"ContainerStarted","Data":"3b0adaa9824244a2e52624b2f4be9da35f8f92dca8d605cdb6afa392ee7abc1c"} Apr 16 23:50:37.066130 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.066130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" event={"ID":"f39171be-e0ce-40eb-86b2-8d51c766008b","Type":"ContainerDied","Data":"719c594af347d12811a13b17ec8304aabee67984ab5d8cfcb3a82704190bf9d3"} Apr 16 23:50:37.066385 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.066139 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" event={"ID":"f39171be-e0ce-40eb-86b2-8d51c766008b","Type":"ContainerStarted","Data":"c7e8349cbbd1557df641d56519edef7f03a3b7bef39ac94df436f201f89f2e0a"} Apr 16 23:50:37.067336 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.067304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qcsbq" event={"ID":"71c22abd-5fe1-440f-8a1c-d7fd92526d8f","Type":"ContainerStarted","Data":"36da0ed899c8d195dfdbaea983375760a22e6470758e11e5267b4cf501be44c6"} Apr 16 23:50:37.068533 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.068514 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4ldzm" event={"ID":"a6c8c77b-23d8-444b-93d9-efd10d5f4f5b","Type":"ContainerStarted","Data":"aa1a8e6d3a7f77f66a4b0a888927716d2a33e35e353833f56c3d0ed6a827960a"} Apr 16 23:50:37.069673 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.069654 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-68t8w" event={"ID":"ae9169bb-ec87-4908-a951-5b40a1ba2267","Type":"ContainerStarted","Data":"23cf539bf8e3e2177b532383ed320772fc1a4ddddf809fe217d9c40a3e2be1fe"} Apr 16 23:50:37.103855 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.103820 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-68t8w" podStartSLOduration=3.915597882 podStartE2EDuration="21.103809267s" podCreationTimestamp="2026-04-16 23:50:16 +0000 UTC" firstStartedPulling="2026-04-16 23:50:18.731130709 +0000 UTC m=+3.277121269" lastFinishedPulling="2026-04-16 23:50:35.919342088 +0000 UTC m=+20.465332654" observedRunningTime="2026-04-16 23:50:37.091903326 +0000 UTC m=+21.637893908" watchObservedRunningTime="2026-04-16 23:50:37.103809267 +0000 UTC m=+21.649799831" Apr 16 23:50:37.104219 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.104196 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dtpng" podStartSLOduration=3.878535195 podStartE2EDuration="21.104191547s" podCreationTimestamp="2026-04-16 23:50:16 +0000 UTC" firstStartedPulling="2026-04-16 23:50:18.740534438 +0000 UTC m=+3.286524998" lastFinishedPulling="2026-04-16 23:50:35.966190789 +0000 UTC m=+20.512181350" observedRunningTime="2026-04-16 23:50:37.103756648 +0000 UTC m=+21.649747224" watchObservedRunningTime="2026-04-16 23:50:37.104191547 +0000 UTC m=+21.650182129" Apr 16 23:50:37.118682 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.118651 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-qcsbq" podStartSLOduration=8.461424976 podStartE2EDuration="21.118640989s" podCreationTimestamp="2026-04-16 23:50:16 +0000 UTC" firstStartedPulling="2026-04-16 23:50:18.737458793 +0000 UTC m=+3.283449355" lastFinishedPulling="2026-04-16 23:50:31.394674805 +0000 UTC m=+15.940665368" observedRunningTime="2026-04-16 23:50:37.118453724 +0000 UTC m=+21.664444306" watchObservedRunningTime="2026-04-16 23:50:37.118640989 +0000 UTC m=+21.664631568" Apr 16 23:50:37.151761 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.151730 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-747kz" podStartSLOduration=3.961338009 
podStartE2EDuration="21.151719746s" podCreationTimestamp="2026-04-16 23:50:16 +0000 UTC" firstStartedPulling="2026-04-16 23:50:18.739403111 +0000 UTC m=+3.285393671" lastFinishedPulling="2026-04-16 23:50:35.929784842 +0000 UTC m=+20.475775408" observedRunningTime="2026-04-16 23:50:37.140060775 +0000 UTC m=+21.686051359" watchObservedRunningTime="2026-04-16 23:50:37.151719746 +0000 UTC m=+21.697710385" Apr 16 23:50:37.151969 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.151947 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4ldzm" podStartSLOduration=3.965935861 podStartE2EDuration="21.151942036s" podCreationTimestamp="2026-04-16 23:50:16 +0000 UTC" firstStartedPulling="2026-04-16 23:50:18.733297509 +0000 UTC m=+3.279288071" lastFinishedPulling="2026-04-16 23:50:35.919303671 +0000 UTC m=+20.465294246" observedRunningTime="2026-04-16 23:50:37.15171805 +0000 UTC m=+21.697708633" watchObservedRunningTime="2026-04-16 23:50:37.151942036 +0000 UTC m=+21.697932617" Apr 16 23:50:37.282452 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.282429 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 23:50:37.950274 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.950161 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T23:50:37.282448398Z","UUID":"bf59a889-6a85-4b7b-80e9-b9a2cbc2074a","Handler":null,"Name":"","Endpoint":""} Apr 16 23:50:37.953463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.953441 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 23:50:37.953617 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.953475 2576 
csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 23:50:37.976875 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.976845 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s" Apr 16 23:50:37.977036 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:37.976995 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358" Apr 16 23:50:37.977102 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:37.977055 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:37.977209 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:37.977187 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7" Apr 16 23:50:38.074013 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:38.073980 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" event={"ID":"33e105a8-68bd-410a-93cc-61f702f80d3e","Type":"ContainerStarted","Data":"24266824af6268b8237169e76e35dfbaaa1f00eff60cac1ea1ab8033f6803af9"} Apr 16 23:50:38.075540 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:38.075339 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5xhvf" event={"ID":"069cfdd1-867c-43a9-bb51-9a876f085e59","Type":"ContainerStarted","Data":"6988cba7155b6baeac982035828fee55e57f2ec4906303351dd54589c0944296"} Apr 16 23:50:38.976340 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:38.976317 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:50:38.976475 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:38.976412 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670" Apr 16 23:50:39.081178 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:39.080971 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log" Apr 16 23:50:39.081667 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:39.081637 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" event={"ID":"f39171be-e0ce-40eb-86b2-8d51c766008b","Type":"ContainerStarted","Data":"7e6edb387d88f96e21a192a4bd3ac20f0a709f46d23d42b8578c93f6f8943ac3"} Apr 16 23:50:39.083726 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:39.083700 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" event={"ID":"33e105a8-68bd-410a-93cc-61f702f80d3e","Type":"ContainerStarted","Data":"df00eb5b603e0b40b4b762aa832fd7fa70025a69d497d5baefd50fa1645c2b5c"} Apr 16 23:50:39.099504 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:39.099462 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5xhvf" podStartSLOduration=6.098791222 podStartE2EDuration="23.099449692s" podCreationTimestamp="2026-04-16 23:50:16 +0000 UTC" firstStartedPulling="2026-04-16 23:50:18.743967396 +0000 UTC m=+3.289957956" lastFinishedPulling="2026-04-16 23:50:35.74462585 +0000 UTC m=+20.290616426" observedRunningTime="2026-04-16 23:50:38.102465765 +0000 UTC m=+22.648456344" watchObservedRunningTime="2026-04-16 23:50:39.099449692 +0000 UTC m=+23.645440300" Apr 16 23:50:39.781716 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:39.781685 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-qcsbq" Apr 16 23:50:39.976363 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:39.976332 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6"
Apr 16 23:50:39.976533 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:39.976459 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7"
Apr 16 23:50:39.976533 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:39.976513 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s"
Apr 16 23:50:39.976651 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:39.976598 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358"
Apr 16 23:50:40.495899 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:40.495856 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-qcsbq"
Apr 16 23:50:40.496473 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:40.496454 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-qcsbq"
Apr 16 23:50:40.521694 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:40.521644 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzrfp" podStartSLOduration=4.65920993 podStartE2EDuration="24.521628551s" podCreationTimestamp="2026-04-16 23:50:16 +0000 UTC" firstStartedPulling="2026-04-16 23:50:18.743333525 +0000 UTC m=+3.289324085" lastFinishedPulling="2026-04-16 23:50:38.605752147 +0000 UTC m=+23.151742706" observedRunningTime="2026-04-16 23:50:39.099653682 +0000 UTC m=+23.645644265" watchObservedRunningTime="2026-04-16 23:50:40.521628551 +0000 UTC m=+25.067619136"
Apr 16 23:50:40.976505 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:40.976478 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96"
Apr 16 23:50:40.976686 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:40.976584 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670"
Apr 16 23:50:41.089266 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:41.089236 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-qcsbq"
Apr 16 23:50:41.976704 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:41.976673 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s"
Apr 16 23:50:41.977200 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:41.976788 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358"
Apr 16 23:50:41.977200 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:41.976891 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6"
Apr 16 23:50:41.977200 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:41.977017 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7"
Apr 16 23:50:42.976598 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:42.976432 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96"
Apr 16 23:50:42.976772 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:42.976679 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670"
Apr 16 23:50:43.092881 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:43.092847 2576 generic.go:358] "Generic (PLEG): container finished" podID="4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc" containerID="aeb8ec216d038564279f9592e312045571ec2415df4be9a257211e9178670f90" exitCode=0
Apr 16 23:50:43.093009 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:43.092950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5gfp" event={"ID":"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc","Type":"ContainerDied","Data":"aeb8ec216d038564279f9592e312045571ec2415df4be9a257211e9178670f90"}
Apr 16 23:50:43.098244 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:43.098226 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log"
Apr 16 23:50:43.098556 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:43.098536 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" event={"ID":"f39171be-e0ce-40eb-86b2-8d51c766008b","Type":"ContainerStarted","Data":"26811200c1d004c3957613a4ef43e5ef53d8c18ce115fba1fa528bc8d91abf7e"}
Apr 16 23:50:43.098849 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:43.098832 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:43.098849 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:43.098852 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:43.099025 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:43.098860 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:43.099025 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:43.098969 2576 scope.go:117] "RemoveContainer" containerID="719c594af347d12811a13b17ec8304aabee67984ab5d8cfcb3a82704190bf9d3"
Apr 16 23:50:43.114379 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:43.114359 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:43.114460 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:43.114447 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:50:43.977131 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:43.977108 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6"
Apr 16 23:50:43.977532 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:43.977136 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s"
Apr 16 23:50:43.977532 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:43.977263 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7"
Apr 16 23:50:43.977682 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:43.977658 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358"
Apr 16 23:50:44.083887 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:44.083517 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qgk96"]
Apr 16 23:50:44.083887 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:44.083687 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96"
Apr 16 23:50:44.083887 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:44.083816 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670"
Apr 16 23:50:44.084844 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:44.084822 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tnq5s"]
Apr 16 23:50:44.085497 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:44.085473 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ktsl6"]
Apr 16 23:50:44.102051 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:44.102021 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5gfp" event={"ID":"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc","Type":"ContainerStarted","Data":"23ed68079567b953be540fd0dbd4fb806f6cb3775874a70e0742ade689f3a14f"}
Apr 16 23:50:44.105888 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:44.105870 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log"
Apr 16 23:50:44.106243 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:44.106227 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6"
Apr 16 23:50:44.106352 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:44.106332 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7"
Apr 16 23:50:44.106540 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:44.106519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" event={"ID":"f39171be-e0ce-40eb-86b2-8d51c766008b","Type":"ContainerStarted","Data":"4e062a02fe76625435b364c485e76dc132c27526256fe3f2d7df1aa21bcb6ada"}
Apr 16 23:50:44.106628 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:44.106597 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s"
Apr 16 23:50:44.106675 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:44.106661 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358"
Apr 16 23:50:44.147039 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:44.146993 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz" podStartSLOduration=10.89817691 podStartE2EDuration="28.146980574s" podCreationTimestamp="2026-04-16 23:50:16 +0000 UTC" firstStartedPulling="2026-04-16 23:50:18.742969739 +0000 UTC m=+3.288960305" lastFinishedPulling="2026-04-16 23:50:35.991773394 +0000 UTC m=+20.537763969" observedRunningTime="2026-04-16 23:50:44.146710751 +0000 UTC m=+28.692701333" watchObservedRunningTime="2026-04-16 23:50:44.146980574 +0000 UTC m=+28.692971151"
Apr 16 23:50:45.109887 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:45.109852 2576 generic.go:358] "Generic (PLEG): container finished" podID="4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc" containerID="23ed68079567b953be540fd0dbd4fb806f6cb3775874a70e0742ade689f3a14f" exitCode=0
Apr 16 23:50:45.109887 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:45.109884 2576 generic.go:358] "Generic (PLEG): container finished" podID="4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc" containerID="f186143f4b12709fc1fd48cfbef9e71481b67c931adddfa2780b77fd071300e4" exitCode=0
Apr 16 23:50:45.110315 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:45.109889 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5gfp" event={"ID":"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc","Type":"ContainerDied","Data":"23ed68079567b953be540fd0dbd4fb806f6cb3775874a70e0742ade689f3a14f"}
Apr 16 23:50:45.110315 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:45.109957 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5gfp" event={"ID":"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc","Type":"ContainerDied","Data":"f186143f4b12709fc1fd48cfbef9e71481b67c931adddfa2780b77fd071300e4"}
Apr 16 23:50:45.979970 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:45.979657 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s"
Apr 16 23:50:45.980114 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:45.979772 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96"
Apr 16 23:50:45.980258 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:45.980228 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670"
Apr 16 23:50:45.980258 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:45.980229 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358"
Apr 16 23:50:45.980347 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:45.979796 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6"
Apr 16 23:50:45.980434 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:45.980417 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7"
Apr 16 23:50:47.976800 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:47.976766 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6"
Apr 16 23:50:47.977368 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:47.976764 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96"
Apr 16 23:50:47.977368 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:47.976912 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s"
Apr 16 23:50:47.977368 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:47.976904 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7"
Apr 16 23:50:47.977368 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:47.977046 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qgk96" podUID="74530a0f-97e2-4c5b-a142-39f048b22670"
Apr 16 23:50:47.977368 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:47.977136 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tnq5s" podUID="c517f342-1f92-4fea-88bf-76e1e2f71358"
Apr 16 23:50:49.265953 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.265861 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeReady"
Apr 16 23:50:49.266517 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.266031 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 23:50:49.298655 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.298620 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj"]
Apr 16 23:50:49.303980 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.303954 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v"]
Apr 16 23:50:49.304931 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.304397 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj"
Apr 16 23:50:49.307130 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.307105 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 23:50:49.307251 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.307134 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-74lf7\""
Apr 16 23:50:49.307312 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.307239 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 23:50:49.310473 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.308229 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 23:50:49.310816 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.310795 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-fd45ff64f-99klg"]
Apr 16 23:50:49.310994 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.310974 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v"
Apr 16 23:50:49.313591 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.312452 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 23:50:49.314188 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.314171 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 23:50:49.316247 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.316210 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb"]
Apr 16 23:50:49.316352 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.316311 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:50:49.319164 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.319096 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 23:50:49.319383 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.319127 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 23:50:49.319458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.319162 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 23:50:49.319458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.319226 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nf568\""
Apr 16 23:50:49.319972 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.319625 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj"]
Apr 16 23:50:49.319972 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.319649 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v"]
Apr 16 23:50:49.319972 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.319663 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb"]
Apr 16 23:50:49.319972 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.319679 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-fd45ff64f-99klg"]
Apr 16 23:50:49.319972 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.319780 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb"
Apr 16 23:50:49.319972 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.319826 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k8zp4"]
Apr 16 23:50:49.321754 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.321737 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 23:50:49.321938 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.321740 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 23:50:49.322015 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.321808 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 23:50:49.322355 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.322335 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 23:50:49.323531 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.323509 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k8zp4"
Apr 16 23:50:49.324389 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.324327 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 23:50:49.325511 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.325484 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 23:50:49.325511 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.325502 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 23:50:49.325680 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.325565 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 23:50:49.325743 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.325735 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6grx4\""
Apr 16 23:50:49.332206 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.332186 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k8zp4"]
Apr 16 23:50:49.414934 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.414886 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fmhdc"]
Apr 16 23:50:49.418769 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.418737 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fmhdc"
Apr 16 23:50:49.420568 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.420545 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 23:50:49.420681 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.420629 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 23:50:49.420795 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.420781 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jh2wm\""
Apr 16 23:50:49.425153 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.425130 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fmhdc"]
Apr 16 23:50:49.445433 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0b88722-fb4c-4375-8c87-0f9c5b719af9-tmp\") pod \"klusterlet-addon-workmgr-b484578cc-xjc9v\" (UID: \"c0b88722-fb4c-4375-8c87-0f9c5b719af9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v"
Apr 16 23:50:49.445557 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445445 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c0b88722-fb4c-4375-8c87-0f9c5b719af9-klusterlet-config\") pod \"klusterlet-addon-workmgr-b484578cc-xjc9v\" (UID: \"c0b88722-fb4c-4375-8c87-0f9c5b719af9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v"
Apr 16 23:50:49.445557 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f1774876-46b7-47d7-802c-a56cb85674a5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb"
Apr 16 23:50:49.445557 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f1774876-46b7-47d7-802c-a56cb85674a5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb"
Apr 16 23:50:49.445717 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445588 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f1774876-46b7-47d7-802c-a56cb85674a5-ca\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb"
Apr 16 23:50:49.445717 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2dcx\" (UniqueName: \"kubernetes.io/projected/c0b88722-fb4c-4375-8c87-0f9c5b719af9-kube-api-access-r2dcx\") pod \"klusterlet-addon-workmgr-b484578cc-xjc9v\" (UID: \"c0b88722-fb4c-4375-8c87-0f9c5b719af9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v"
Apr 16 23:50:49.445717 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445658 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aca02152-9722-4dd6-bafd-8732fb7ae807-installation-pull-secrets\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:50:49.445717 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445674 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-bound-sa-token\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:50:49.445717 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnm2g\" (UniqueName: \"kubernetes.io/projected/bd90a043-dac9-4e2d-96a1-4eabae55281f-kube-api-access-bnm2g\") pod \"ingress-canary-k8zp4\" (UID: \"bd90a043-dac9-4e2d-96a1-4eabae55281f\") " pod="openshift-ingress-canary/ingress-canary-k8zp4"
Apr 16 23:50:49.445717 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445708 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aca02152-9722-4dd6-bafd-8732fb7ae807-ca-trust-extracted\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:50:49.446039 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1c5c1538-0dba-4c64-8be2-7925c1e9905a-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7695465d5f-ddqwj\" (UID: \"1c5c1538-0dba-4c64-8be2-7925c1e9905a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj"
Apr 16 23:50:49.446039 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert\") pod \"ingress-canary-k8zp4\" (UID: \"bd90a043-dac9-4e2d-96a1-4eabae55281f\") " pod="openshift-ingress-canary/ingress-canary-k8zp4"
Apr 16 23:50:49.446039 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445768 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aca02152-9722-4dd6-bafd-8732fb7ae807-trusted-ca\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:50:49.446039 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445791 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4b26\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-kube-api-access-w4b26\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:50:49.446039 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445818 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-certificates\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:50:49.446039 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mpnr\" (UniqueName: \"kubernetes.io/projected/f1774876-46b7-47d7-802c-a56cb85674a5-kube-api-access-5mpnr\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb"
Apr 16 23:50:49.446039 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkhq7\" (UniqueName: \"kubernetes.io/projected/1c5c1538-0dba-4c64-8be2-7925c1e9905a-kube-api-access-rkhq7\") pod \"managed-serviceaccount-addon-agent-7695465d5f-ddqwj\" (UID: \"1c5c1538-0dba-4c64-8be2-7925c1e9905a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj"
Apr 16 23:50:49.446039 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445948 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:50:49.446039 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445973 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aca02152-9722-4dd6-bafd-8732fb7ae807-image-registry-private-configuration\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:50:49.446039 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.445997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"hub\" (UniqueName: \"kubernetes.io/secret/f1774876-46b7-47d7-802c-a56cb85674a5-hub\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" Apr 16 23:50:49.446039 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.446017 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f1774876-46b7-47d7-802c-a56cb85674a5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" Apr 16 23:50:49.546857 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.546780 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f1774876-46b7-47d7-802c-a56cb85674a5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" Apr 16 23:50:49.546857 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.546820 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f1774876-46b7-47d7-802c-a56cb85674a5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" Apr 16 23:50:49.546857 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.546842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f1774876-46b7-47d7-802c-a56cb85674a5-ca\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" 
(UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" Apr 16 23:50:49.547120 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547010 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2dcx\" (UniqueName: \"kubernetes.io/projected/c0b88722-fb4c-4375-8c87-0f9c5b719af9-kube-api-access-r2dcx\") pod \"klusterlet-addon-workmgr-b484578cc-xjc9v\" (UID: \"c0b88722-fb4c-4375-8c87-0f9c5b719af9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v" Apr 16 23:50:49.547120 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aca02152-9722-4dd6-bafd-8732fb7ae807-installation-pull-secrets\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.547120 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-bound-sa-token\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.547120 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9dsw\" (UniqueName: \"kubernetes.io/projected/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-kube-api-access-w9dsw\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc" Apr 16 23:50:49.547322 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547151 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnm2g\" (UniqueName: \"kubernetes.io/projected/bd90a043-dac9-4e2d-96a1-4eabae55281f-kube-api-access-bnm2g\") pod \"ingress-canary-k8zp4\" (UID: \"bd90a043-dac9-4e2d-96a1-4eabae55281f\") " pod="openshift-ingress-canary/ingress-canary-k8zp4" Apr 16 23:50:49.547322 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547175 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aca02152-9722-4dd6-bafd-8732fb7ae807-ca-trust-extracted\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.547322 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547220 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1c5c1538-0dba-4c64-8be2-7925c1e9905a-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7695465d5f-ddqwj\" (UID: \"1c5c1538-0dba-4c64-8be2-7925c1e9905a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj" Apr 16 23:50:49.547322 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert\") pod \"ingress-canary-k8zp4\" (UID: \"bd90a043-dac9-4e2d-96a1-4eabae55281f\") " pod="openshift-ingress-canary/ingress-canary-k8zp4" Apr 16 23:50:49.547322 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aca02152-9722-4dd6-bafd-8732fb7ae807-trusted-ca\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " 
pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.547322 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4b26\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-kube-api-access-w4b26\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.547582 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-certificates\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.547582 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mpnr\" (UniqueName: \"kubernetes.io/projected/f1774876-46b7-47d7-802c-a56cb85674a5-kube-api-access-5mpnr\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" Apr 16 23:50:49.547582 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547415 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkhq7\" (UniqueName: \"kubernetes.io/projected/1c5c1538-0dba-4c64-8be2-7925c1e9905a-kube-api-access-rkhq7\") pod \"managed-serviceaccount-addon-agent-7695465d5f-ddqwj\" (UID: \"1c5c1538-0dba-4c64-8be2-7925c1e9905a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj" Apr 16 23:50:49.547582 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547455 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.547701 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:49.547626 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:50:49.547701 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547635 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aca02152-9722-4dd6-bafd-8732fb7ae807-ca-trust-extracted\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.547701 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f1774876-46b7-47d7-802c-a56cb85674a5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" Apr 16 23:50:49.547787 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:49.547735 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert podName:bd90a043-dac9-4e2d-96a1-4eabae55281f nodeName:}" failed. No retries permitted until 2026-04-16 23:50:50.047706567 +0000 UTC m=+34.593697132 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert") pod "ingress-canary-k8zp4" (UID: "bd90a043-dac9-4e2d-96a1-4eabae55281f") : secret "canary-serving-cert" not found Apr 16 23:50:49.547974 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aca02152-9722-4dd6-bafd-8732fb7ae807-image-registry-private-configuration\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.548055 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f1774876-46b7-47d7-802c-a56cb85674a5-hub\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" Apr 16 23:50:49.548055 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.547982 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-certificates\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.548139 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:49.548106 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:50:49.548180 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:49.548146 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-fd45ff64f-99klg: secret "image-registry-tls" not found Apr 16 23:50:49.548232 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:49.548213 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls podName:aca02152-9722-4dd6-bafd-8732fb7ae807 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:50.048195294 +0000 UTC m=+34.594185869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls") pod "image-registry-fd45ff64f-99klg" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807") : secret "image-registry-tls" not found Apr 16 23:50:49.548301 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.548280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f1774876-46b7-47d7-802c-a56cb85674a5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" Apr 16 23:50:49.548349 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.548328 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0b88722-fb4c-4375-8c87-0f9c5b719af9-tmp\") pod \"klusterlet-addon-workmgr-b484578cc-xjc9v\" (UID: \"c0b88722-fb4c-4375-8c87-0f9c5b719af9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v" Apr 16 23:50:49.548964 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.548394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-tmp-dir\") pod \"dns-default-fmhdc\" (UID: 
\"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc" Apr 16 23:50:49.548964 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.548460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c0b88722-fb4c-4375-8c87-0f9c5b719af9-klusterlet-config\") pod \"klusterlet-addon-workmgr-b484578cc-xjc9v\" (UID: \"c0b88722-fb4c-4375-8c87-0f9c5b719af9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v" Apr 16 23:50:49.548964 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.548492 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-config-volume\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc" Apr 16 23:50:49.548964 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.548547 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc" Apr 16 23:50:49.549218 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.549178 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aca02152-9722-4dd6-bafd-8732fb7ae807-trusted-ca\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.549445 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.549422 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/c0b88722-fb4c-4375-8c87-0f9c5b719af9-tmp\") pod \"klusterlet-addon-workmgr-b484578cc-xjc9v\" (UID: \"c0b88722-fb4c-4375-8c87-0f9c5b719af9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v" Apr 16 23:50:49.552677 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.552473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f1774876-46b7-47d7-802c-a56cb85674a5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" Apr 16 23:50:49.552677 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.552514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aca02152-9722-4dd6-bafd-8732fb7ae807-image-registry-private-configuration\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.552677 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.552567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f1774876-46b7-47d7-802c-a56cb85674a5-ca\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" Apr 16 23:50:49.552677 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.552600 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aca02152-9722-4dd6-bafd-8732fb7ae807-installation-pull-secrets\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " 
pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.552677 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.552639 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f1774876-46b7-47d7-802c-a56cb85674a5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" Apr 16 23:50:49.553492 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.553471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f1774876-46b7-47d7-802c-a56cb85674a5-hub\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" Apr 16 23:50:49.554050 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.554015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1c5c1538-0dba-4c64-8be2-7925c1e9905a-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7695465d5f-ddqwj\" (UID: \"1c5c1538-0dba-4c64-8be2-7925c1e9905a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj" Apr 16 23:50:49.554600 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.554578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c0b88722-fb4c-4375-8c87-0f9c5b719af9-klusterlet-config\") pod \"klusterlet-addon-workmgr-b484578cc-xjc9v\" (UID: \"c0b88722-fb4c-4375-8c87-0f9c5b719af9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v" Apr 16 23:50:49.559316 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.559290 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-r2dcx\" (UniqueName: \"kubernetes.io/projected/c0b88722-fb4c-4375-8c87-0f9c5b719af9-kube-api-access-r2dcx\") pod \"klusterlet-addon-workmgr-b484578cc-xjc9v\" (UID: \"c0b88722-fb4c-4375-8c87-0f9c5b719af9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v" Apr 16 23:50:49.560000 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.559976 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnm2g\" (UniqueName: \"kubernetes.io/projected/bd90a043-dac9-4e2d-96a1-4eabae55281f-kube-api-access-bnm2g\") pod \"ingress-canary-k8zp4\" (UID: \"bd90a043-dac9-4e2d-96a1-4eabae55281f\") " pod="openshift-ingress-canary/ingress-canary-k8zp4" Apr 16 23:50:49.560130 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.560024 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4b26\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-kube-api-access-w4b26\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.560303 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.560285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-bound-sa-token\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:50:49.561160 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.561108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mpnr\" (UniqueName: \"kubernetes.io/projected/f1774876-46b7-47d7-802c-a56cb85674a5-kube-api-access-5mpnr\") pod \"cluster-proxy-proxy-agent-57bc76658b-fz7jb\" (UID: \"f1774876-46b7-47d7-802c-a56cb85674a5\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" Apr 16 23:50:49.561836 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.561815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkhq7\" (UniqueName: \"kubernetes.io/projected/1c5c1538-0dba-4c64-8be2-7925c1e9905a-kube-api-access-rkhq7\") pod \"managed-serviceaccount-addon-agent-7695465d5f-ddqwj\" (UID: \"1c5c1538-0dba-4c64-8be2-7925c1e9905a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj" Apr 16 23:50:49.632416 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.632382 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj" Apr 16 23:50:49.640275 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.640247 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v" Apr 16 23:50:49.649347 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.649316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-config-volume\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc" Apr 16 23:50:49.649472 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.649352 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc" Apr 16 23:50:49.649472 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.649386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs\") pod \"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:50:49.649472 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.649442 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9dsw\" (UniqueName: \"kubernetes.io/projected/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-kube-api-access-w9dsw\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc" Apr 16 23:50:49.649608 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:49.649474 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:50:49.649608 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.649534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-tmp-dir\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc" Apr 16 23:50:49.649608 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:49.649594 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls podName:7f0fc05f-8b6e-4d52-8962-dda3bf88746a nodeName:}" failed. No retries permitted until 2026-04-16 23:50:50.149575508 +0000 UTC m=+34.695566070 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls") pod "dns-default-fmhdc" (UID: "7f0fc05f-8b6e-4d52-8962-dda3bf88746a") : secret "dns-default-metrics-tls" not found Apr 16 23:50:49.649714 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:49.649658 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:49.649714 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:49.649692 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs podName:c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:21.649681676 +0000 UTC m=+66.195672236 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs") pod "network-metrics-daemon-ktsl6" (UID: "c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:49.649873 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.649856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-config-volume\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc" Apr 16 23:50:49.650039 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.650017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-tmp-dir\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc" Apr 16 23:50:49.654313 ip-10-0-133-231 kubenswrapper[2576]: I0416 
23:50:49.654285 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb"
Apr 16 23:50:49.662947 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.662930 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9dsw\" (UniqueName: \"kubernetes.io/projected/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-kube-api-access-w9dsw\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc"
Apr 16 23:50:49.750270 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.750229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clp8z\" (UniqueName: \"kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z\") pod \"network-check-target-qgk96\" (UID: \"74530a0f-97e2-4c5b-a142-39f048b22670\") " pod="openshift-network-diagnostics/network-check-target-qgk96"
Apr 16 23:50:49.750421 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:49.750391 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 23:50:49.750421 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:49.750408 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 23:50:49.750421 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:49.750417 2576 projected.go:194] Error preparing data for projected volume kube-api-access-clp8z for pod openshift-network-diagnostics/network-check-target-qgk96: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:49.750556 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:49.750468 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z podName:74530a0f-97e2-4c5b-a142-39f048b22670 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:21.750453211 +0000 UTC m=+66.296443774 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-clp8z" (UniqueName: "kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z") pod "network-check-target-qgk96" (UID: "74530a0f-97e2-4c5b-a142-39f048b22670") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:49.979906 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.979875 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s"
Apr 16 23:50:49.980113 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.979883 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6"
Apr 16 23:50:49.980113 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.979883 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96"
Apr 16 23:50:49.982586 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.982562 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 23:50:49.982859 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.982838 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 23:50:49.982859 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.982853 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 23:50:49.983165 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.983059 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x54vt\""
Apr 16 23:50:49.983165 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.983158 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 23:50:49.983331 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:49.983196 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xjj7v\""
Apr 16 23:50:50.053164 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:50.053135 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert\") pod \"ingress-canary-k8zp4\" (UID: \"bd90a043-dac9-4e2d-96a1-4eabae55281f\") " pod="openshift-ingress-canary/ingress-canary-k8zp4"
Apr 16 23:50:50.053293 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:50.053189 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:50:50.053293 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:50.053281 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:50:50.053408 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:50.053290 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 23:50:50.053408 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:50.053306 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fd45ff64f-99klg: secret "image-registry-tls" not found
Apr 16 23:50:50.053408 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:50.053368 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert podName:bd90a043-dac9-4e2d-96a1-4eabae55281f nodeName:}" failed. No retries permitted until 2026-04-16 23:50:51.053345073 +0000 UTC m=+35.599335635 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert") pod "ingress-canary-k8zp4" (UID: "bd90a043-dac9-4e2d-96a1-4eabae55281f") : secret "canary-serving-cert" not found
Apr 16 23:50:50.053408 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:50.053390 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls podName:aca02152-9722-4dd6-bafd-8732fb7ae807 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:51.053379761 +0000 UTC m=+35.599370325 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls") pod "image-registry-fd45ff64f-99klg" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807") : secret "image-registry-tls" not found
Apr 16 23:50:50.154289 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:50.154258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc"
Apr 16 23:50:50.154459 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:50.154439 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:50:50.154529 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:50.154517 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls podName:7f0fc05f-8b6e-4d52-8962-dda3bf88746a nodeName:}" failed. No retries permitted until 2026-04-16 23:50:51.154496034 +0000 UTC m=+35.700486598 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls") pod "dns-default-fmhdc" (UID: "7f0fc05f-8b6e-4d52-8962-dda3bf88746a") : secret "dns-default-metrics-tls" not found
Apr 16 23:50:51.063050 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:51.063017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert\") pod \"ingress-canary-k8zp4\" (UID: \"bd90a043-dac9-4e2d-96a1-4eabae55281f\") " pod="openshift-ingress-canary/ingress-canary-k8zp4"
Apr 16 23:50:51.063638 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:51.063085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:50:51.063638 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:51.063171 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:50:51.063638 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:51.063235 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 23:50:51.063638 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:51.063249 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert podName:bd90a043-dac9-4e2d-96a1-4eabae55281f nodeName:}" failed. No retries permitted until 2026-04-16 23:50:53.063233615 +0000 UTC m=+37.609224175 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert") pod "ingress-canary-k8zp4" (UID: "bd90a043-dac9-4e2d-96a1-4eabae55281f") : secret "canary-serving-cert" not found
Apr 16 23:50:51.063638 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:51.063251 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fd45ff64f-99klg: secret "image-registry-tls" not found
Apr 16 23:50:51.063638 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:51.063294 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls podName:aca02152-9722-4dd6-bafd-8732fb7ae807 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:53.063280611 +0000 UTC m=+37.609271171 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls") pod "image-registry-fd45ff64f-99klg" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807") : secret "image-registry-tls" not found
Apr 16 23:50:51.164450 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:51.164413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc"
Apr 16 23:50:51.164589 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:51.164539 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:50:51.164633 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:51.164612 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls podName:7f0fc05f-8b6e-4d52-8962-dda3bf88746a nodeName:}" failed. No retries permitted until 2026-04-16 23:50:53.164592393 +0000 UTC m=+37.710582976 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls") pod "dns-default-fmhdc" (UID: "7f0fc05f-8b6e-4d52-8962-dda3bf88746a") : secret "dns-default-metrics-tls" not found
Apr 16 23:50:51.307439 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:51.307220 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb"]
Apr 16 23:50:51.308034 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:51.308013 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v"]
Apr 16 23:50:51.309952 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:51.309929 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj"]
Apr 16 23:50:51.368417 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:51.368325 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0b88722_fb4c_4375_8c87_0f9c5b719af9.slice/crio-1cb61e85a9a2d7adfc72e395f74d6860df8339379378a3253984125528c4edbc WatchSource:0}: Error finding container 1cb61e85a9a2d7adfc72e395f74d6860df8339379378a3253984125528c4edbc: Status 404 returned error can't find the container with id 1cb61e85a9a2d7adfc72e395f74d6860df8339379378a3253984125528c4edbc
Apr 16 23:50:51.369054 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:51.368951 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1774876_46b7_47d7_802c_a56cb85674a5.slice/crio-970d966477c565fcad7c8f960876dbb5f9a352a587c66c1119310fe2b03ec496 WatchSource:0}: Error finding container 970d966477c565fcad7c8f960876dbb5f9a352a587c66c1119310fe2b03ec496: Status 404 returned error can't find the container with id 970d966477c565fcad7c8f960876dbb5f9a352a587c66c1119310fe2b03ec496
Apr 16 23:50:51.369538 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:51.369515 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c5c1538_0dba_4c64_8be2_7925c1e9905a.slice/crio-de484fc27564a2bc5a46cef92c48350463c94e62097bd2e2164a69bc89e75479 WatchSource:0}: Error finding container de484fc27564a2bc5a46cef92c48350463c94e62097bd2e2164a69bc89e75479: Status 404 returned error can't find the container with id de484fc27564a2bc5a46cef92c48350463c94e62097bd2e2164a69bc89e75479
Apr 16 23:50:51.871084 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:51.871046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s"
Apr 16 23:50:51.874508 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:51.874447 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c517f342-1f92-4fea-88bf-76e1e2f71358-original-pull-secret\") pod \"global-pull-secret-syncer-tnq5s\" (UID: \"c517f342-1f92-4fea-88bf-76e1e2f71358\") " pod="kube-system/global-pull-secret-syncer-tnq5s"
Apr 16 23:50:52.093051 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:52.093018 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tnq5s"
Apr 16 23:50:52.129292 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:52.129056 2576 generic.go:358] "Generic (PLEG): container finished" podID="4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc" containerID="161bfdd5db6911927dda9cdb0652f5768ba4d2f18cf7cfe932d84d5ddb10172f" exitCode=0
Apr 16 23:50:52.129292 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:52.129143 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5gfp" event={"ID":"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc","Type":"ContainerDied","Data":"161bfdd5db6911927dda9cdb0652f5768ba4d2f18cf7cfe932d84d5ddb10172f"}
Apr 16 23:50:52.138233 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:52.138197 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj" event={"ID":"1c5c1538-0dba-4c64-8be2-7925c1e9905a","Type":"ContainerStarted","Data":"de484fc27564a2bc5a46cef92c48350463c94e62097bd2e2164a69bc89e75479"}
Apr 16 23:50:52.141338 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:52.141264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v" event={"ID":"c0b88722-fb4c-4375-8c87-0f9c5b719af9","Type":"ContainerStarted","Data":"1cb61e85a9a2d7adfc72e395f74d6860df8339379378a3253984125528c4edbc"}
Apr 16 23:50:52.142658 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:52.142615 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" event={"ID":"f1774876-46b7-47d7-802c-a56cb85674a5","Type":"ContainerStarted","Data":"970d966477c565fcad7c8f960876dbb5f9a352a587c66c1119310fe2b03ec496"}
Apr 16 23:50:52.296289 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:52.295197 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tnq5s"]
Apr 16 23:50:52.310787 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:50:52.310751 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc517f342_1f92_4fea_88bf_76e1e2f71358.slice/crio-b9f16b5d83d049d06d6c294e48048e12228f47afabca8cd14d48a8fb6cc2fcfd WatchSource:0}: Error finding container b9f16b5d83d049d06d6c294e48048e12228f47afabca8cd14d48a8fb6cc2fcfd: Status 404 returned error can't find the container with id b9f16b5d83d049d06d6c294e48048e12228f47afabca8cd14d48a8fb6cc2fcfd
Apr 16 23:50:53.081740 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:53.081492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:50:53.081985 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:53.081849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert\") pod \"ingress-canary-k8zp4\" (UID: \"bd90a043-dac9-4e2d-96a1-4eabae55281f\") " pod="openshift-ingress-canary/ingress-canary-k8zp4"
Apr 16 23:50:53.082089 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:53.082000 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:50:53.082089 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:53.082063 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert podName:bd90a043-dac9-4e2d-96a1-4eabae55281f nodeName:}" failed. No retries permitted until 2026-04-16 23:50:57.082042716 +0000 UTC m=+41.628033282 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert") pod "ingress-canary-k8zp4" (UID: "bd90a043-dac9-4e2d-96a1-4eabae55281f") : secret "canary-serving-cert" not found
Apr 16 23:50:53.082486 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:53.082464 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 23:50:53.082486 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:53.082488 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fd45ff64f-99klg: secret "image-registry-tls" not found
Apr 16 23:50:53.082640 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:53.082533 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls podName:aca02152-9722-4dd6-bafd-8732fb7ae807 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:57.082517932 +0000 UTC m=+41.628508495 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls") pod "image-registry-fd45ff64f-99klg" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807") : secret "image-registry-tls" not found
Apr 16 23:50:53.162320 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:53.161304 2576 generic.go:358] "Generic (PLEG): container finished" podID="4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc" containerID="f3e1c1bf1a01f314ab174a0365a3246011cbef59af069904da466c4c9748af54" exitCode=0
Apr 16 23:50:53.162320 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:53.161384 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5gfp" event={"ID":"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc","Type":"ContainerDied","Data":"f3e1c1bf1a01f314ab174a0365a3246011cbef59af069904da466c4c9748af54"}
Apr 16 23:50:53.165409 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:53.165385 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tnq5s" event={"ID":"c517f342-1f92-4fea-88bf-76e1e2f71358","Type":"ContainerStarted","Data":"b9f16b5d83d049d06d6c294e48048e12228f47afabca8cd14d48a8fb6cc2fcfd"}
Apr 16 23:50:53.184894 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:53.184840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc"
Apr 16 23:50:53.185072 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:53.185054 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:50:53.185147 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:53.185118 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls podName:7f0fc05f-8b6e-4d52-8962-dda3bf88746a nodeName:}" failed. No retries permitted until 2026-04-16 23:50:57.185100029 +0000 UTC m=+41.731090593 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls") pod "dns-default-fmhdc" (UID: "7f0fc05f-8b6e-4d52-8962-dda3bf88746a") : secret "dns-default-metrics-tls" not found
Apr 16 23:50:54.172324 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:54.172269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5gfp" event={"ID":"4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc","Type":"ContainerStarted","Data":"e7e0a82f3ef42628fd2f886172f4fb58d9065ae2742d27fb1612f3c928d92e6c"}
Apr 16 23:50:56.000886 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:56.000820 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-x5gfp" podStartSLOduration=7.341114858 podStartE2EDuration="40.000807083s" podCreationTimestamp="2026-04-16 23:50:16 +0000 UTC" firstStartedPulling="2026-04-16 23:50:18.738473637 +0000 UTC m=+3.284464201" lastFinishedPulling="2026-04-16 23:50:51.398165848 +0000 UTC m=+35.944156426" observedRunningTime="2026-04-16 23:50:54.202129562 +0000 UTC m=+38.748120146" watchObservedRunningTime="2026-04-16 23:50:56.000807083 +0000 UTC m=+40.546797677"
Apr 16 23:50:57.122102 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:57.122067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:50:57.122529 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:57.122181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert\") pod \"ingress-canary-k8zp4\" (UID: \"bd90a043-dac9-4e2d-96a1-4eabae55281f\") " pod="openshift-ingress-canary/ingress-canary-k8zp4"
Apr 16 23:50:57.122529 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:57.122219 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 23:50:57.122529 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:57.122242 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fd45ff64f-99klg: secret "image-registry-tls" not found
Apr 16 23:50:57.122529 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:57.122294 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:50:57.122529 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:57.122299 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls podName:aca02152-9722-4dd6-bafd-8732fb7ae807 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:05.122282795 +0000 UTC m=+49.668273355 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls") pod "image-registry-fd45ff64f-99klg" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807") : secret "image-registry-tls" not found
Apr 16 23:50:57.122529 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:57.122350 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert podName:bd90a043-dac9-4e2d-96a1-4eabae55281f nodeName:}" failed. No retries permitted until 2026-04-16 23:51:05.122334817 +0000 UTC m=+49.668325385 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert") pod "ingress-canary-k8zp4" (UID: "bd90a043-dac9-4e2d-96a1-4eabae55281f") : secret "canary-serving-cert" not found
Apr 16 23:50:57.222932 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:50:57.222894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc"
Apr 16 23:50:57.223124 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:57.223046 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:50:57.223190 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:50:57.223126 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls podName:7f0fc05f-8b6e-4d52-8962-dda3bf88746a nodeName:}" failed. No retries permitted until 2026-04-16 23:51:05.223105113 +0000 UTC m=+49.769095676 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls") pod "dns-default-fmhdc" (UID: "7f0fc05f-8b6e-4d52-8962-dda3bf88746a") : secret "dns-default-metrics-tls" not found
Apr 16 23:51:00.186252 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:00.186170 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v" event={"ID":"c0b88722-fb4c-4375-8c87-0f9c5b719af9","Type":"ContainerStarted","Data":"aa47a015fc7fd9d87c137d992df431a51171c9a76548a7dbac2f6320742eb6bb"}
Apr 16 23:51:00.186724 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:00.186349 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v"
Apr 16 23:51:00.187577 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:00.187540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" event={"ID":"f1774876-46b7-47d7-802c-a56cb85674a5","Type":"ContainerStarted","Data":"ea4e8af77fe6a4207836b638940c6f9f25ab4ab8a515f9d9884f12decddea273"}
Apr 16 23:51:00.188343 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:00.188327 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v"
Apr 16 23:51:00.188984 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:00.188961 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj" event={"ID":"1c5c1538-0dba-4c64-8be2-7925c1e9905a","Type":"ContainerStarted","Data":"f06c32496dfdb721b33dfefc3e81beca104997612bcc07bf6fda17958ffca9e6"}
Apr 16 23:51:00.190299 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:00.190276 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tnq5s" event={"ID":"c517f342-1f92-4fea-88bf-76e1e2f71358","Type":"ContainerStarted","Data":"4127e934ee236132073117cfe5080b156141edda566455be5fb3f67f836f2105"}
Apr 16 23:51:00.202991 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:00.202956 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v" podStartSLOduration=25.821227666 podStartE2EDuration="34.20294396s" podCreationTimestamp="2026-04-16 23:50:26 +0000 UTC" firstStartedPulling="2026-04-16 23:50:51.374230549 +0000 UTC m=+35.920221116" lastFinishedPulling="2026-04-16 23:50:59.755946835 +0000 UTC m=+44.301937410" observedRunningTime="2026-04-16 23:51:00.201944101 +0000 UTC m=+44.747934674" watchObservedRunningTime="2026-04-16 23:51:00.20294396 +0000 UTC m=+44.748934533"
Apr 16 23:51:00.214305 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:00.214269 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-tnq5s" podStartSLOduration=32.761887624 podStartE2EDuration="40.214258326s" podCreationTimestamp="2026-04-16 23:50:20 +0000 UTC" firstStartedPulling="2026-04-16 23:50:52.313256634 +0000 UTC m=+36.859247208" lastFinishedPulling="2026-04-16 23:50:59.765627347 +0000 UTC m=+44.311617910" observedRunningTime="2026-04-16 23:51:00.213261421 +0000 UTC m=+44.759252002" watchObservedRunningTime="2026-04-16 23:51:00.214258326 +0000 UTC m=+44.760248905"
Apr 16 23:51:00.225831 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:00.225797 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj" podStartSLOduration=25.845305985 podStartE2EDuration="34.225786603s" podCreationTimestamp="2026-04-16 23:50:26 +0000 UTC" firstStartedPulling="2026-04-16 23:50:51.374431471 +0000 UTC m=+35.920422034" lastFinishedPulling="2026-04-16 23:50:59.754912068 +0000 UTC m=+44.300902652" observedRunningTime="2026-04-16 23:51:00.225286132 +0000 UTC m=+44.771276715" watchObservedRunningTime="2026-04-16 23:51:00.225786603 +0000 UTC m=+44.771777536"
Apr 16 23:51:03.198996 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:03.198956 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" event={"ID":"f1774876-46b7-47d7-802c-a56cb85674a5","Type":"ContainerStarted","Data":"98ab7bc435f542104a5c081a7bbb25691beac79306ef8329b916247b706b63c1"}
Apr 16 23:51:03.198996 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:03.198999 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" event={"ID":"f1774876-46b7-47d7-802c-a56cb85674a5","Type":"ContainerStarted","Data":"99171eabd2a707f16d67b4088211a3cd7eb784eee9efae59089400304d2381ae"}
Apr 16 23:51:03.215484 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:03.215435 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" podStartSLOduration=24.999006364 podStartE2EDuration="36.215422438s" podCreationTimestamp="2026-04-16 23:50:27 +0000 UTC" firstStartedPulling="2026-04-16 23:50:51.374231545 +0000 UTC m=+35.920222109" lastFinishedPulling="2026-04-16 23:51:02.590647618 +0000 UTC m=+47.136638183" observedRunningTime="2026-04-16 23:51:03.214180324 +0000 UTC m=+47.760170906" watchObservedRunningTime="2026-04-16 23:51:03.215422438 +0000 UTC m=+47.761413032"
Apr 16 23:51:05.181471 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:05.181433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:51:05.181828 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:05.181543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert\") pod \"ingress-canary-k8zp4\" (UID: \"bd90a043-dac9-4e2d-96a1-4eabae55281f\") " pod="openshift-ingress-canary/ingress-canary-k8zp4"
Apr 16 23:51:05.181828 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:05.181579 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 23:51:05.181828 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:05.181596 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fd45ff64f-99klg: secret "image-registry-tls" not found
Apr 16 23:51:05.181828 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:05.181635 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:51:05.181828 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:05.181659 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls podName:aca02152-9722-4dd6-bafd-8732fb7ae807 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:21.181641613 +0000 UTC m=+65.727632177 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls") pod "image-registry-fd45ff64f-99klg" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807") : secret "image-registry-tls" not found
Apr 16 23:51:05.181828 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:05.181677 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert podName:bd90a043-dac9-4e2d-96a1-4eabae55281f nodeName:}" failed. No retries permitted until 2026-04-16 23:51:21.181668168 +0000 UTC m=+65.727658729 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert") pod "ingress-canary-k8zp4" (UID: "bd90a043-dac9-4e2d-96a1-4eabae55281f") : secret "canary-serving-cert" not found
Apr 16 23:51:05.282373 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:05.282347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc"
Apr 16 23:51:05.282505 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:05.282487 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:51:05.282556 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:05.282548 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls podName:7f0fc05f-8b6e-4d52-8962-dda3bf88746a nodeName:}" failed. No retries permitted until 2026-04-16 23:51:21.282532362 +0000 UTC m=+65.828522922 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls") pod "dns-default-fmhdc" (UID: "7f0fc05f-8b6e-4d52-8962-dda3bf88746a") : secret "dns-default-metrics-tls" not found
Apr 16 23:51:15.121273 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:15.121240 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-btfdz"
Apr 16 23:51:21.197077 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:21.197043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert\") pod \"ingress-canary-k8zp4\" (UID: \"bd90a043-dac9-4e2d-96a1-4eabae55281f\") " pod="openshift-ingress-canary/ingress-canary-k8zp4"
Apr 16 23:51:21.197549 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:21.197087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:51:21.197549 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:21.197178 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 23:51:21.197549 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:21.197189 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fd45ff64f-99klg: secret "image-registry-tls" not found
Apr 16 23:51:21.197549 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:21.197191 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:51:21.197549 ip-10-0-133-231
kubenswrapper[2576]: E0416 23:51:21.197239 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls podName:aca02152-9722-4dd6-bafd-8732fb7ae807 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:53.197225624 +0000 UTC m=+97.743216184 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls") pod "image-registry-fd45ff64f-99klg" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807") : secret "image-registry-tls" not found Apr 16 23:51:21.197549 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:21.197250 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert podName:bd90a043-dac9-4e2d-96a1-4eabae55281f nodeName:}" failed. No retries permitted until 2026-04-16 23:51:53.197245008 +0000 UTC m=+97.743235568 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert") pod "ingress-canary-k8zp4" (UID: "bd90a043-dac9-4e2d-96a1-4eabae55281f") : secret "canary-serving-cert" not found Apr 16 23:51:21.298448 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:21.298416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc" Apr 16 23:51:21.298578 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:21.298533 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:51:21.298617 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:21.298582 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls podName:7f0fc05f-8b6e-4d52-8962-dda3bf88746a nodeName:}" failed. No retries permitted until 2026-04-16 23:51:53.298568243 +0000 UTC m=+97.844558802 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls") pod "dns-default-fmhdc" (UID: "7f0fc05f-8b6e-4d52-8962-dda3bf88746a") : secret "dns-default-metrics-tls" not found Apr 16 23:51:21.701133 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:21.701096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs\") pod \"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:51:21.703217 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:21.703200 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 23:51:21.711669 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:21.711652 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 23:51:21.711736 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:21.711703 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs podName:c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:25.711689223 +0000 UTC m=+130.257679783 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs") pod "network-metrics-daemon-ktsl6" (UID: "c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7") : secret "metrics-daemon-secret" not found Apr 16 23:51:21.801742 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:21.801720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clp8z\" (UniqueName: \"kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z\") pod \"network-check-target-qgk96\" (UID: \"74530a0f-97e2-4c5b-a142-39f048b22670\") " pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:51:21.804009 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:21.803993 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 23:51:21.814541 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:21.814524 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 23:51:21.826642 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:21.826620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clp8z\" (UniqueName: \"kubernetes.io/projected/74530a0f-97e2-4c5b-a142-39f048b22670-kube-api-access-clp8z\") pod \"network-check-target-qgk96\" (UID: \"74530a0f-97e2-4c5b-a142-39f048b22670\") " pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:51:22.108578 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:22.108550 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x54vt\"" Apr 16 23:51:22.116948 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:22.116909 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:51:22.229972 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:22.229944 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qgk96"] Apr 16 23:51:22.233438 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:51:22.233411 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74530a0f_97e2_4c5b_a142_39f048b22670.slice/crio-268a087fb163580ee4915135dd592a2c8ab22fc606be3526ec6596410ccbb0a5 WatchSource:0}: Error finding container 268a087fb163580ee4915135dd592a2c8ab22fc606be3526ec6596410ccbb0a5: Status 404 returned error can't find the container with id 268a087fb163580ee4915135dd592a2c8ab22fc606be3526ec6596410ccbb0a5 Apr 16 23:51:22.245533 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:22.245509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qgk96" event={"ID":"74530a0f-97e2-4c5b-a142-39f048b22670","Type":"ContainerStarted","Data":"268a087fb163580ee4915135dd592a2c8ab22fc606be3526ec6596410ccbb0a5"} Apr 16 23:51:26.256210 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:26.256174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qgk96" event={"ID":"74530a0f-97e2-4c5b-a142-39f048b22670","Type":"ContainerStarted","Data":"b6ae15ae7b5db79a72739859f390542a44f86caf67d87b0c2b1a7b28464d0a5f"} Apr 16 23:51:26.256594 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:26.256340 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:51:26.270446 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:26.270401 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qgk96" 
podStartSLOduration=67.01004108 podStartE2EDuration="1m10.270389314s" podCreationTimestamp="2026-04-16 23:50:16 +0000 UTC" firstStartedPulling="2026-04-16 23:51:22.235792681 +0000 UTC m=+66.781783241" lastFinishedPulling="2026-04-16 23:51:25.496140902 +0000 UTC m=+70.042131475" observedRunningTime="2026-04-16 23:51:26.269938085 +0000 UTC m=+70.815928661" watchObservedRunningTime="2026-04-16 23:51:26.270389314 +0000 UTC m=+70.816379896" Apr 16 23:51:53.223227 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:53.223101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert\") pod \"ingress-canary-k8zp4\" (UID: \"bd90a043-dac9-4e2d-96a1-4eabae55281f\") " pod="openshift-ingress-canary/ingress-canary-k8zp4" Apr 16 23:51:53.223227 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:53.223152 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg" Apr 16 23:51:53.223684 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:53.223261 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:51:53.223684 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:53.223267 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:51:53.223684 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:53.223284 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fd45ff64f-99klg: secret "image-registry-tls" not found Apr 16 23:51:53.223684 ip-10-0-133-231 kubenswrapper[2576]: E0416 
23:51:53.223341 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls podName:aca02152-9722-4dd6-bafd-8732fb7ae807 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:57.223328026 +0000 UTC m=+161.769318587 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls") pod "image-registry-fd45ff64f-99klg" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807") : secret "image-registry-tls" not found Apr 16 23:51:53.223684 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:53.223352 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert podName:bd90a043-dac9-4e2d-96a1-4eabae55281f nodeName:}" failed. No retries permitted until 2026-04-16 23:52:57.223347145 +0000 UTC m=+161.769337705 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert") pod "ingress-canary-k8zp4" (UID: "bd90a043-dac9-4e2d-96a1-4eabae55281f") : secret "canary-serving-cert" not found Apr 16 23:51:53.324274 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:53.324239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc" Apr 16 23:51:53.324416 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:53.324374 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:51:53.324458 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:51:53.324426 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls podName:7f0fc05f-8b6e-4d52-8962-dda3bf88746a nodeName:}" failed. No retries permitted until 2026-04-16 23:52:57.324412119 +0000 UTC m=+161.870402679 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls") pod "dns-default-fmhdc" (UID: "7f0fc05f-8b6e-4d52-8962-dda3bf88746a") : secret "dns-default-metrics-tls" not found Apr 16 23:51:57.261109 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:51:57.261078 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qgk96" Apr 16 23:52:22.351745 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:22.351717 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-747kz_3d193d29-372b-44a9-a007-2f9fd389e08e/dns-node-resolver/0.log" Apr 16 23:52:23.352303 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:23.352274 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4ldzm_a6c8c77b-23d8-444b-93d9-efd10d5f4f5b/node-ca/0.log" Apr 16 23:52:25.748970 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:25.748909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs\") pod \"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:52:25.749347 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:52:25.749058 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 23:52:25.749347 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:52:25.749121 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs podName:c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7 nodeName:}" failed. No retries permitted until 2026-04-16 23:54:27.749102809 +0000 UTC m=+252.295093371 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs") pod "network-metrics-daemon-ktsl6" (UID: "c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7") : secret "metrics-daemon-secret" not found Apr 16 23:52:44.980431 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:44.980394 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9q77n"] Apr 16 23:52:44.983392 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:44.983372 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:44.985462 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:44.985440 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 23:52:44.985580 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:44.985493 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 23:52:44.986089 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:44.986072 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 23:52:44.986320 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:44.986303 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 23:52:44.986438 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:44.986420 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-kxrl6\"" Apr 16 23:52:44.993253 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:44.993233 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9q77n"] Apr 16 23:52:45.095307 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.095276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctm7m\" (UniqueName: \"kubernetes.io/projected/3fef4520-3fbc-4e8b-a599-35236b6977ca-kube-api-access-ctm7m\") pod \"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.095474 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.095310 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3fef4520-3fbc-4e8b-a599-35236b6977ca-data-volume\") pod \"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.095474 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.095341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3fef4520-3fbc-4e8b-a599-35236b6977ca-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.095474 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.095461 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3fef4520-3fbc-4e8b-a599-35236b6977ca-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.095601 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.095557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3fef4520-3fbc-4e8b-a599-35236b6977ca-crio-socket\") pod \"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.196020 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.195988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3fef4520-3fbc-4e8b-a599-35236b6977ca-crio-socket\") pod \"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.196167 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.196040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctm7m\" (UniqueName: \"kubernetes.io/projected/3fef4520-3fbc-4e8b-a599-35236b6977ca-kube-api-access-ctm7m\") pod \"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.196167 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.196066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3fef4520-3fbc-4e8b-a599-35236b6977ca-data-volume\") pod \"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.196167 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.196103 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3fef4520-3fbc-4e8b-a599-35236b6977ca-crio-socket\") pod \"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.196329 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.196228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3fef4520-3fbc-4e8b-a599-35236b6977ca-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.196329 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.196275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3fef4520-3fbc-4e8b-a599-35236b6977ca-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.196433 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.196393 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3fef4520-3fbc-4e8b-a599-35236b6977ca-data-volume\") pod \"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.196743 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.196726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3fef4520-3fbc-4e8b-a599-35236b6977ca-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" 
Apr 16 23:52:45.198463 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.198445 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3fef4520-3fbc-4e8b-a599-35236b6977ca-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.204241 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.204217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctm7m\" (UniqueName: \"kubernetes.io/projected/3fef4520-3fbc-4e8b-a599-35236b6977ca-kube-api-access-ctm7m\") pod \"insights-runtime-extractor-9q77n\" (UID: \"3fef4520-3fbc-4e8b-a599-35236b6977ca\") " pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.292186 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.292133 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9q77n" Apr 16 23:52:45.409099 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.409070 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9q77n"] Apr 16 23:52:45.412590 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:52:45.412563 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fef4520_3fbc_4e8b_a599_35236b6977ca.slice/crio-440cea9049278d8174c595d9e7a7cf33558913e75cdff99b4c46a3c39e39fd52 WatchSource:0}: Error finding container 440cea9049278d8174c595d9e7a7cf33558913e75cdff99b4c46a3c39e39fd52: Status 404 returned error can't find the container with id 440cea9049278d8174c595d9e7a7cf33558913e75cdff99b4c46a3c39e39fd52 Apr 16 23:52:45.437668 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:45.437642 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9q77n" event={"ID":"3fef4520-3fbc-4e8b-a599-35236b6977ca","Type":"ContainerStarted","Data":"440cea9049278d8174c595d9e7a7cf33558913e75cdff99b4c46a3c39e39fd52"} Apr 16 23:52:46.441861 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:46.441786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9q77n" event={"ID":"3fef4520-3fbc-4e8b-a599-35236b6977ca","Type":"ContainerStarted","Data":"2f7fa8d401a5859ce38850d28956cc02ab93aa85127477d2c55a6f8a30bd60e7"} Apr 16 23:52:46.441861 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:46.441820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9q77n" event={"ID":"3fef4520-3fbc-4e8b-a599-35236b6977ca","Type":"ContainerStarted","Data":"0f149e77c2845535f95d000055225218f5362941c1888fd0c800bd091aefed81"} Apr 16 23:52:48.448238 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:48.448202 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-insights/insights-runtime-extractor-9q77n" event={"ID":"3fef4520-3fbc-4e8b-a599-35236b6977ca","Type":"ContainerStarted","Data":"72835cd3fe1a8e274bdaf2a138ef00bec4ca1a1de99b82156bd2586a877b31c0"} Apr 16 23:52:48.464044 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:48.463998 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9q77n" podStartSLOduration=2.269797751 podStartE2EDuration="4.463984184s" podCreationTimestamp="2026-04-16 23:52:44 +0000 UTC" firstStartedPulling="2026-04-16 23:52:45.465863702 +0000 UTC m=+150.011854268" lastFinishedPulling="2026-04-16 23:52:47.660050141 +0000 UTC m=+152.206040701" observedRunningTime="2026-04-16 23:52:48.462615449 +0000 UTC m=+153.008606040" watchObservedRunningTime="2026-04-16 23:52:48.463984184 +0000 UTC m=+153.009974765" Apr 16 23:52:51.951865 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:51.951828 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-v8w79"] Apr 16 23:52:51.955153 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:51.955131 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:51.957003 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:51.956976 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 23:52:51.957152 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:51.957131 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 23:52:51.957225 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:51.957147 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 23:52:51.957225 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:51.957159 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 23:52:51.957569 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:51.957551 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 23:52:51.957569 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:51.957557 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fhzzn\""
Apr 16 23:52:51.957697 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:51.957591 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 23:52:52.049638 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.049603 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1bd9626-2f26-4387-95e3-983d628930db-metrics-client-ca\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.049638 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.049640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c1bd9626-2f26-4387-95e3-983d628930db-root\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.049817 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.049659 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-wtmp\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.049817 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.049724 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rn8h\" (UniqueName: \"kubernetes.io/projected/c1bd9626-2f26-4387-95e3-983d628930db-kube-api-access-8rn8h\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.049817 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.049762 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1bd9626-2f26-4387-95e3-983d628930db-sys\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.049817 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.049779 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-tls\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.049817 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.049812 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-accelerators-collector-config\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.049986 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.049881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.049986 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.049913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-textfile\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.150870 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.150841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1bd9626-2f26-4387-95e3-983d628930db-metrics-client-ca\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.151032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.150880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c1bd9626-2f26-4387-95e3-983d628930db-root\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.151032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.150906 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-wtmp\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.151032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.150979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c1bd9626-2f26-4387-95e3-983d628930db-root\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.151180 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.151041 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rn8h\" (UniqueName: \"kubernetes.io/projected/c1bd9626-2f26-4387-95e3-983d628930db-kube-api-access-8rn8h\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.151180 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.151071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1bd9626-2f26-4387-95e3-983d628930db-sys\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.151180 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.151088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-tls\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.151180 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.151070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-wtmp\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.151180 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.151109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1bd9626-2f26-4387-95e3-983d628930db-sys\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.151397 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.151211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-accelerators-collector-config\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.151397 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.151241 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.151397 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.151272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-textfile\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.151532 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.151475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1bd9626-2f26-4387-95e3-983d628930db-metrics-client-ca\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.151595 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.151544 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-textfile\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.151687 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.151667 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-accelerators-collector-config\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.153458 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.153437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-tls\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.153608 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.153588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1bd9626-2f26-4387-95e3-983d628930db-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.157611 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.157588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rn8h\" (UniqueName: \"kubernetes.io/projected/c1bd9626-2f26-4387-95e3-983d628930db-kube-api-access-8rn8h\") pod \"node-exporter-v8w79\" (UID: \"c1bd9626-2f26-4387-95e3-983d628930db\") " pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.263972 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.263889 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-v8w79"
Apr 16 23:52:52.271985 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:52:52.271960 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1bd9626_2f26_4387_95e3_983d628930db.slice/crio-1b6550f6fde5256f11c6e5f199659f319eed4d14b4f404c3ed4e8e9c54609fe9 WatchSource:0}: Error finding container 1b6550f6fde5256f11c6e5f199659f319eed4d14b4f404c3ed4e8e9c54609fe9: Status 404 returned error can't find the container with id 1b6550f6fde5256f11c6e5f199659f319eed4d14b4f404c3ed4e8e9c54609fe9
Apr 16 23:52:52.347234 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:52:52.347196 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-fd45ff64f-99klg" podUID="aca02152-9722-4dd6-bafd-8732fb7ae807"
Apr 16 23:52:52.368509 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:52:52.368486 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-k8zp4" podUID="bd90a043-dac9-4e2d-96a1-4eabae55281f"
Apr 16 23:52:52.431182 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:52:52.431151 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-fmhdc" podUID="7f0fc05f-8b6e-4d52-8962-dda3bf88746a"
Apr 16 23:52:52.457269 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.457249 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fmhdc"
Apr 16 23:52:52.457370 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.457268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v8w79" event={"ID":"c1bd9626-2f26-4387-95e3-983d628930db","Type":"ContainerStarted","Data":"1b6550f6fde5256f11c6e5f199659f319eed4d14b4f404c3ed4e8e9c54609fe9"}
Apr 16 23:52:52.457370 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.457289 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:52:52.457435 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:52.457388 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k8zp4"
Apr 16 23:52:53.000774 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:52:53.000737 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-ktsl6" podUID="c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7"
Apr 16 23:52:53.461142 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:53.461106 2576 generic.go:358] "Generic (PLEG): container finished" podID="c1bd9626-2f26-4387-95e3-983d628930db" containerID="f32db5be593a89c6efed2fd07286158a13cd1edef93023780fe1db0ed8120d42" exitCode=0
Apr 16 23:52:53.461307 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:53.461152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v8w79" event={"ID":"c1bd9626-2f26-4387-95e3-983d628930db","Type":"ContainerDied","Data":"f32db5be593a89c6efed2fd07286158a13cd1edef93023780fe1db0ed8120d42"}
Apr 16 23:52:54.465694 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:54.465655 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v8w79" event={"ID":"c1bd9626-2f26-4387-95e3-983d628930db","Type":"ContainerStarted","Data":"fa21b7a262dcc0216ba7fa5acc7b3c5045c348da86c0d945aeddd06325b666d6"}
Apr 16 23:52:54.465694 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:54.465699 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v8w79" event={"ID":"c1bd9626-2f26-4387-95e3-983d628930db","Type":"ContainerStarted","Data":"bad821626d2298213d0b76466723fc89e1ba2cf596cbc82c628482064481fb1f"}
Apr 16 23:52:54.481586 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:54.481543 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-v8w79" podStartSLOduration=2.606763461 podStartE2EDuration="3.481530528s" podCreationTimestamp="2026-04-16 23:52:51 +0000 UTC" firstStartedPulling="2026-04-16 23:52:52.273697943 +0000 UTC m=+156.819688502" lastFinishedPulling="2026-04-16 23:52:53.148464992 +0000 UTC m=+157.694455569" observedRunningTime="2026-04-16 23:52:54.480511452 +0000 UTC m=+159.026502034" watchObservedRunningTime="2026-04-16 23:52:54.481530528 +0000 UTC m=+159.027521110"
Apr 16 23:52:57.290791 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.290755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:52:57.291205 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.290836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert\") pod \"ingress-canary-k8zp4\" (UID: \"bd90a043-dac9-4e2d-96a1-4eabae55281f\") " pod="openshift-ingress-canary/ingress-canary-k8zp4"
Apr 16 23:52:57.293248 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.293226 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls\") pod \"image-registry-fd45ff64f-99klg\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:52:57.293374 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.293353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd90a043-dac9-4e2d-96a1-4eabae55281f-cert\") pod \"ingress-canary-k8zp4\" (UID: \"bd90a043-dac9-4e2d-96a1-4eabae55281f\") " pod="openshift-ingress-canary/ingress-canary-k8zp4"
Apr 16 23:52:57.391567 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.391532 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc"
Apr 16 23:52:57.393664 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.393643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f0fc05f-8b6e-4d52-8962-dda3bf88746a-metrics-tls\") pod \"dns-default-fmhdc\" (UID: \"7f0fc05f-8b6e-4d52-8962-dda3bf88746a\") " pod="openshift-dns/dns-default-fmhdc"
Apr 16 23:52:57.560481 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.560456 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6grx4\""
Apr 16 23:52:57.560604 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.560541 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jh2wm\""
Apr 16 23:52:57.560658 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.560623 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nf568\""
Apr 16 23:52:57.568841 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.568822 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:52:57.568956 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.568906 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k8zp4"
Apr 16 23:52:57.568956 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.568940 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fmhdc"
Apr 16 23:52:57.708690 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.708654 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-fd45ff64f-99klg"]
Apr 16 23:52:57.712419 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:52:57.712391 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaca02152_9722_4dd6_bafd_8732fb7ae807.slice/crio-f34720f67fae2939daba61f484c75b34160b79c8a98ecdf7d96092fa1c71b67f WatchSource:0}: Error finding container f34720f67fae2939daba61f484c75b34160b79c8a98ecdf7d96092fa1c71b67f: Status 404 returned error can't find the container with id f34720f67fae2939daba61f484c75b34160b79c8a98ecdf7d96092fa1c71b67f
Apr 16 23:52:57.925405 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.925335 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k8zp4"]
Apr 16 23:52:57.928471 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:57.928449 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fmhdc"]
Apr 16 23:52:57.928621 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:52:57.928599 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd90a043_dac9_4e2d_96a1_4eabae55281f.slice/crio-cf19e807db9f5df11de881bdace28cc18125a0990f4cdf1cffe5e75a839935db WatchSource:0}: Error finding container cf19e807db9f5df11de881bdace28cc18125a0990f4cdf1cffe5e75a839935db: Status 404 returned error can't find the container with id cf19e807db9f5df11de881bdace28cc18125a0990f4cdf1cffe5e75a839935db
Apr 16 23:52:57.931431 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:52:57.931409 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f0fc05f_8b6e_4d52_8962_dda3bf88746a.slice/crio-d994a216c70b99b7886f75d4da663fc5f1859dc12b0894cef914ae2c2b2d953b WatchSource:0}: Error finding container d994a216c70b99b7886f75d4da663fc5f1859dc12b0894cef914ae2c2b2d953b: Status 404 returned error can't find the container with id d994a216c70b99b7886f75d4da663fc5f1859dc12b0894cef914ae2c2b2d953b
Apr 16 23:52:58.475446 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:58.475408 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k8zp4" event={"ID":"bd90a043-dac9-4e2d-96a1-4eabae55281f","Type":"ContainerStarted","Data":"cf19e807db9f5df11de881bdace28cc18125a0990f4cdf1cffe5e75a839935db"}
Apr 16 23:52:58.476783 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:58.476749 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fd45ff64f-99klg" event={"ID":"aca02152-9722-4dd6-bafd-8732fb7ae807","Type":"ContainerStarted","Data":"e238770dea92f6131ae98338bc4fcf669f2017704e592916f5e50adae27a4fc6"}
Apr 16 23:52:58.476783 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:58.476781 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fd45ff64f-99klg" event={"ID":"aca02152-9722-4dd6-bafd-8732fb7ae807","Type":"ContainerStarted","Data":"f34720f67fae2939daba61f484c75b34160b79c8a98ecdf7d96092fa1c71b67f"}
Apr 16 23:52:58.477001 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:58.476868 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:52:58.477789 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:58.477765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fmhdc" event={"ID":"7f0fc05f-8b6e-4d52-8962-dda3bf88746a","Type":"ContainerStarted","Data":"d994a216c70b99b7886f75d4da663fc5f1859dc12b0894cef914ae2c2b2d953b"}
Apr 16 23:52:58.492826 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:52:58.492790 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-fd45ff64f-99klg" podStartSLOduration=162.492778805 podStartE2EDuration="2m42.492778805s" podCreationTimestamp="2026-04-16 23:50:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:52:58.492090578 +0000 UTC m=+163.038081160" watchObservedRunningTime="2026-04-16 23:52:58.492778805 +0000 UTC m=+163.038769381"
Apr 16 23:53:00.187100 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:00.187050 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v" podUID="c0b88722-fb4c-4375-8c87-0f9c5b719af9" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.7:8000/readyz\": dial tcp 10.132.0.7:8000: connect: connection refused"
Apr 16 23:53:00.489704 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:00.489619 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fmhdc" event={"ID":"7f0fc05f-8b6e-4d52-8962-dda3bf88746a","Type":"ContainerStarted","Data":"37054ac37eccdd51b01039430031e8cae1874c64c91f1b651c93e75ae5314b68"}
Apr 16 23:53:00.489704 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:00.489664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fmhdc" event={"ID":"7f0fc05f-8b6e-4d52-8962-dda3bf88746a","Type":"ContainerStarted","Data":"0d5c4061e41e084e3dee7741813077a88e8949b48de318b6ce049d29aa8f23d7"}
Apr 16 23:53:00.489912 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:00.489767 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fmhdc"
Apr 16 23:53:00.490938 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:00.490897 2576 generic.go:358] "Generic (PLEG): container finished" podID="1c5c1538-0dba-4c64-8be2-7925c1e9905a" containerID="f06c32496dfdb721b33dfefc3e81beca104997612bcc07bf6fda17958ffca9e6" exitCode=255
Apr 16 23:53:00.491057 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:00.490976 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj" event={"ID":"1c5c1538-0dba-4c64-8be2-7925c1e9905a","Type":"ContainerDied","Data":"f06c32496dfdb721b33dfefc3e81beca104997612bcc07bf6fda17958ffca9e6"}
Apr 16 23:53:00.491305 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:00.491282 2576 scope.go:117] "RemoveContainer" containerID="f06c32496dfdb721b33dfefc3e81beca104997612bcc07bf6fda17958ffca9e6"
Apr 16 23:53:00.492259 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:00.492227 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k8zp4" event={"ID":"bd90a043-dac9-4e2d-96a1-4eabae55281f","Type":"ContainerStarted","Data":"608b900a16309dbe6fca6a84d197041252fdaf48c16dbdf666afe0a28bda74b1"}
Apr 16 23:53:00.493837 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:00.493820 2576 generic.go:358] "Generic (PLEG): container finished" podID="c0b88722-fb4c-4375-8c87-0f9c5b719af9" containerID="aa47a015fc7fd9d87c137d992df431a51171c9a76548a7dbac2f6320742eb6bb" exitCode=1
Apr 16 23:53:00.493884 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:00.493857 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v" event={"ID":"c0b88722-fb4c-4375-8c87-0f9c5b719af9","Type":"ContainerDied","Data":"aa47a015fc7fd9d87c137d992df431a51171c9a76548a7dbac2f6320742eb6bb"}
Apr 16 23:53:00.494135 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:00.494123 2576 scope.go:117] "RemoveContainer" containerID="aa47a015fc7fd9d87c137d992df431a51171c9a76548a7dbac2f6320742eb6bb"
Apr 16 23:53:00.505701 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:00.505533 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fmhdc" podStartSLOduration=129.534711636 podStartE2EDuration="2m11.505518594s" podCreationTimestamp="2026-04-16 23:50:49 +0000 UTC" firstStartedPulling="2026-04-16 23:52:57.933223144 +0000 UTC m=+162.479213707" lastFinishedPulling="2026-04-16 23:52:59.904030092 +0000 UTC m=+164.450020665" observedRunningTime="2026-04-16 23:53:00.503958744 +0000 UTC m=+165.049949324" watchObservedRunningTime="2026-04-16 23:53:00.505518594 +0000 UTC m=+165.051509178"
Apr 16 23:53:00.542665 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:00.542626 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k8zp4" podStartSLOduration=129.565726049 podStartE2EDuration="2m11.542607799s" podCreationTimestamp="2026-04-16 23:50:49 +0000 UTC" firstStartedPulling="2026-04-16 23:52:57.930615903 +0000 UTC m=+162.476606463" lastFinishedPulling="2026-04-16 23:52:59.907497653 +0000 UTC m=+164.453488213" observedRunningTime="2026-04-16 23:53:00.542080739 +0000 UTC m=+165.088071323" watchObservedRunningTime="2026-04-16 23:53:00.542607799 +0000 UTC m=+165.088598381"
Apr 16 23:53:01.498427 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:01.498392 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7695465d5f-ddqwj" event={"ID":"1c5c1538-0dba-4c64-8be2-7925c1e9905a","Type":"ContainerStarted","Data":"e6b8c43763b9527d2a06f575a4cf63f520b852e49df94ff7dba286468a436c21"}
Apr 16 23:53:01.499931 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:01.499894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v" event={"ID":"c0b88722-fb4c-4375-8c87-0f9c5b719af9","Type":"ContainerStarted","Data":"e7e0665a55a5d4bff47d4716084b275bb2763ff6e25d30788c292f7abd8a3010"}
Apr 16 23:53:01.500366 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:01.500346 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v"
Apr 16 23:53:01.500962 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:01.500944 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b484578cc-xjc9v"
Apr 16 23:53:04.977315 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:04.977224 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6"
Apr 16 23:53:10.501959 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:10.501900 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fmhdc"
Apr 16 23:53:17.296032 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:17.295999 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-fd45ff64f-99klg"]
Apr 16 23:53:17.300274 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:17.300246 2576 patch_prober.go:28] interesting pod/image-registry-fd45ff64f-99klg container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 23:53:17.300406 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:17.300294 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-fd45ff64f-99klg" podUID="aca02152-9722-4dd6-bafd-8732fb7ae807" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 23:53:19.655289 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:19.655242 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" podUID="f1774876-46b7-47d7-802c-a56cb85674a5" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 23:53:27.300749 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:27.300686 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:53:29.655656 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:29.655614 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" podUID="f1774876-46b7-47d7-802c-a56cb85674a5" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 23:53:39.656132 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:39.656093 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" podUID="f1774876-46b7-47d7-802c-a56cb85674a5" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 23:53:39.656571 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:39.656158 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb"
Apr 16 23:53:39.656618 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:39.656589 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"98ab7bc435f542104a5c081a7bbb25691beac79306ef8329b916247b706b63c1"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 16 23:53:39.656666 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:39.656635 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" podUID="f1774876-46b7-47d7-802c-a56cb85674a5" containerName="service-proxy" containerID="cri-o://98ab7bc435f542104a5c081a7bbb25691beac79306ef8329b916247b706b63c1" gracePeriod=30
Apr 16 23:53:40.597929 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:40.597878 2576 generic.go:358] "Generic (PLEG): container finished" podID="f1774876-46b7-47d7-802c-a56cb85674a5" containerID="98ab7bc435f542104a5c081a7bbb25691beac79306ef8329b916247b706b63c1" exitCode=2
Apr 16 23:53:40.598070 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:40.597956 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" event={"ID":"f1774876-46b7-47d7-802c-a56cb85674a5","Type":"ContainerDied","Data":"98ab7bc435f542104a5c081a7bbb25691beac79306ef8329b916247b706b63c1"}
Apr 16 23:53:40.598070 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:40.597996 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57bc76658b-fz7jb" event={"ID":"f1774876-46b7-47d7-802c-a56cb85674a5","Type":"ContainerStarted","Data":"00e44d3c3d85e37d330e466be9693b4d134fce95f4f064b2ac7346865387088d"}
Apr 16 23:53:42.314652 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.314611 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-fd45ff64f-99klg" podUID="aca02152-9722-4dd6-bafd-8732fb7ae807" containerName="registry" containerID="cri-o://e238770dea92f6131ae98338bc4fcf669f2017704e592916f5e50adae27a4fc6" gracePeriod=30
Apr 16 23:53:42.546256 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.546235 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:53:42.604018 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.603951 2576 generic.go:358] "Generic (PLEG): container finished" podID="aca02152-9722-4dd6-bafd-8732fb7ae807" containerID="e238770dea92f6131ae98338bc4fcf669f2017704e592916f5e50adae27a4fc6" exitCode=0
Apr 16 23:53:42.604018 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.603995 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-fd45ff64f-99klg"
Apr 16 23:53:42.604237 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.604032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fd45ff64f-99klg" event={"ID":"aca02152-9722-4dd6-bafd-8732fb7ae807","Type":"ContainerDied","Data":"e238770dea92f6131ae98338bc4fcf669f2017704e592916f5e50adae27a4fc6"}
Apr 16 23:53:42.604237 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.604079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fd45ff64f-99klg" event={"ID":"aca02152-9722-4dd6-bafd-8732fb7ae807","Type":"ContainerDied","Data":"f34720f67fae2939daba61f484c75b34160b79c8a98ecdf7d96092fa1c71b67f"}
Apr 16 23:53:42.604237 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.604100 2576 scope.go:117] "RemoveContainer" containerID="e238770dea92f6131ae98338bc4fcf669f2017704e592916f5e50adae27a4fc6"
Apr 16 23:53:42.611742 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.611725 2576 scope.go:117] "RemoveContainer" containerID="e238770dea92f6131ae98338bc4fcf669f2017704e592916f5e50adae27a4fc6"
Apr 16 23:53:42.612051 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:53:42.612030 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e238770dea92f6131ae98338bc4fcf669f2017704e592916f5e50adae27a4fc6\": container with ID starting with e238770dea92f6131ae98338bc4fcf669f2017704e592916f5e50adae27a4fc6 not found: ID does not exist" containerID="e238770dea92f6131ae98338bc4fcf669f2017704e592916f5e50adae27a4fc6"
Apr 16 23:53:42.612117 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.612061 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e238770dea92f6131ae98338bc4fcf669f2017704e592916f5e50adae27a4fc6"} err="failed to get container status 
\"e238770dea92f6131ae98338bc4fcf669f2017704e592916f5e50adae27a4fc6\": rpc error: code = NotFound desc = could not find container \"e238770dea92f6131ae98338bc4fcf669f2017704e592916f5e50adae27a4fc6\": container with ID starting with e238770dea92f6131ae98338bc4fcf669f2017704e592916f5e50adae27a4fc6 not found: ID does not exist" Apr 16 23:53:42.712459 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.712437 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4b26\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-kube-api-access-w4b26\") pod \"aca02152-9722-4dd6-bafd-8732fb7ae807\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " Apr 16 23:53:42.712551 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.712466 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aca02152-9722-4dd6-bafd-8732fb7ae807-image-registry-private-configuration\") pod \"aca02152-9722-4dd6-bafd-8732fb7ae807\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " Apr 16 23:53:42.712551 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.712497 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-bound-sa-token\") pod \"aca02152-9722-4dd6-bafd-8732fb7ae807\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " Apr 16 23:53:42.712551 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.712537 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aca02152-9722-4dd6-bafd-8732fb7ae807-trusted-ca\") pod \"aca02152-9722-4dd6-bafd-8732fb7ae807\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " Apr 16 23:53:42.712660 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.712580 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aca02152-9722-4dd6-bafd-8732fb7ae807-ca-trust-extracted\") pod \"aca02152-9722-4dd6-bafd-8732fb7ae807\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " Apr 16 23:53:42.712660 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.712608 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-certificates\") pod \"aca02152-9722-4dd6-bafd-8732fb7ae807\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " Apr 16 23:53:42.712660 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.712632 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls\") pod \"aca02152-9722-4dd6-bafd-8732fb7ae807\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " Apr 16 23:53:42.712801 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.712676 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aca02152-9722-4dd6-bafd-8732fb7ae807-installation-pull-secrets\") pod \"aca02152-9722-4dd6-bafd-8732fb7ae807\" (UID: \"aca02152-9722-4dd6-bafd-8732fb7ae807\") " Apr 16 23:53:42.713103 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.713039 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "aca02152-9722-4dd6-bafd-8732fb7ae807" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:53:42.713546 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.713524 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aca02152-9722-4dd6-bafd-8732fb7ae807-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "aca02152-9722-4dd6-bafd-8732fb7ae807" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:53:42.715429 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.715398 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-kube-api-access-w4b26" (OuterVolumeSpecName: "kube-api-access-w4b26") pod "aca02152-9722-4dd6-bafd-8732fb7ae807" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807"). InnerVolumeSpecName "kube-api-access-w4b26". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:53:42.715536 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.715397 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca02152-9722-4dd6-bafd-8732fb7ae807-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "aca02152-9722-4dd6-bafd-8732fb7ae807" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:53:42.715536 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.715497 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "aca02152-9722-4dd6-bafd-8732fb7ae807" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:53:42.715640 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.715563 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca02152-9722-4dd6-bafd-8732fb7ae807-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "aca02152-9722-4dd6-bafd-8732fb7ae807" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:53:42.715640 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.715597 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "aca02152-9722-4dd6-bafd-8732fb7ae807" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:53:42.721550 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.721524 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aca02152-9722-4dd6-bafd-8732fb7ae807-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "aca02152-9722-4dd6-bafd-8732fb7ae807" (UID: "aca02152-9722-4dd6-bafd-8732fb7ae807"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:53:42.814241 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.814216 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aca02152-9722-4dd6-bafd-8732fb7ae807-ca-trust-extracted\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 16 23:53:42.814325 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.814248 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-certificates\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 16 23:53:42.814325 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.814265 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-registry-tls\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 16 23:53:42.814325 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.814279 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aca02152-9722-4dd6-bafd-8732fb7ae807-installation-pull-secrets\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 16 23:53:42.814325 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.814294 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w4b26\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-kube-api-access-w4b26\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 16 23:53:42.814325 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.814308 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aca02152-9722-4dd6-bafd-8732fb7ae807-image-registry-private-configuration\") on node 
\"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 16 23:53:42.814325 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.814322 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aca02152-9722-4dd6-bafd-8732fb7ae807-bound-sa-token\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 16 23:53:42.814506 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.814336 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aca02152-9722-4dd6-bafd-8732fb7ae807-trusted-ca\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 16 23:53:42.923083 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.923057 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-fd45ff64f-99klg"] Apr 16 23:53:42.926338 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:42.926319 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-fd45ff64f-99klg"] Apr 16 23:53:43.980747 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:43.980714 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aca02152-9722-4dd6-bafd-8732fb7ae807" path="/var/lib/kubelet/pods/aca02152-9722-4dd6-bafd-8732fb7ae807/volumes" Apr 16 23:53:50.378972 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:50.378943 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v8w79_c1bd9626-2f26-4387-95e3-983d628930db/init-textfile/0.log" Apr 16 23:53:50.579137 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:50.579111 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v8w79_c1bd9626-2f26-4387-95e3-983d628930db/node-exporter/0.log" Apr 16 23:53:50.778580 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:53:50.778502 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-v8w79_c1bd9626-2f26-4387-95e3-983d628930db/kube-rbac-proxy/0.log" Apr 16 23:54:27.801519 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:54:27.801485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs\") pod \"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:54:27.803876 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:54:27.803856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7-metrics-certs\") pod \"network-metrics-daemon-ktsl6\" (UID: \"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7\") " pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:54:28.080067 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:54:28.079995 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xjj7v\"" Apr 16 23:54:28.088539 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:54:28.088522 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktsl6" Apr 16 23:54:28.203349 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:54:28.203312 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ktsl6"] Apr 16 23:54:28.207316 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:54:28.207285 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc74b0b0a_4b58_4ad4_9c2b_fcd27ad15de7.slice/crio-b8f0b1cb00598b09b15b85cf5ce42297b0389802b52440a5e22ea7aa40392079 WatchSource:0}: Error finding container b8f0b1cb00598b09b15b85cf5ce42297b0389802b52440a5e22ea7aa40392079: Status 404 returned error can't find the container with id b8f0b1cb00598b09b15b85cf5ce42297b0389802b52440a5e22ea7aa40392079 Apr 16 23:54:28.725737 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:54:28.725698 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ktsl6" event={"ID":"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7","Type":"ContainerStarted","Data":"b8f0b1cb00598b09b15b85cf5ce42297b0389802b52440a5e22ea7aa40392079"} Apr 16 23:54:29.730265 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:54:29.730232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ktsl6" event={"ID":"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7","Type":"ContainerStarted","Data":"615a530b35e9d56a3c57303ae8c41f397c8a8f19d798a82437c63d44a92ebb87"} Apr 16 23:54:29.730265 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:54:29.730270 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ktsl6" event={"ID":"c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7","Type":"ContainerStarted","Data":"9822534edc72886aef1c8f701996a3526d7f64f4490ee7bf672beff9c5a71e38"} Apr 16 23:54:29.745605 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:54:29.745553 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-ktsl6" podStartSLOduration=252.714001619 podStartE2EDuration="4m13.745537213s" podCreationTimestamp="2026-04-16 23:50:16 +0000 UTC" firstStartedPulling="2026-04-16 23:54:28.209007417 +0000 UTC m=+252.754997980" lastFinishedPulling="2026-04-16 23:54:29.240543011 +0000 UTC m=+253.786533574" observedRunningTime="2026-04-16 23:54:29.744621365 +0000 UTC m=+254.290611947" watchObservedRunningTime="2026-04-16 23:54:29.745537213 +0000 UTC m=+254.291527796" Apr 16 23:55:15.898351 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:15.898323 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log" Apr 16 23:55:15.899568 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:15.899547 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log" Apr 16 23:55:15.904174 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:15.904155 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 23:55:38.561548 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.561514 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-mz4rv"] Apr 16 23:55:38.564050 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.561828 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aca02152-9722-4dd6-bafd-8732fb7ae807" containerName="registry" Apr 16 23:55:38.564050 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.561848 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca02152-9722-4dd6-bafd-8732fb7ae807" containerName="registry" Apr 16 23:55:38.564050 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.561953 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="aca02152-9722-4dd6-bafd-8732fb7ae807" containerName="registry" Apr 16 
23:55:38.564958 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.564940 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-mz4rv" Apr 16 23:55:38.566738 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.566717 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 23:55:38.566833 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.566720 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-z92mx\"" Apr 16 23:55:38.567238 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.567222 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 23:55:38.573850 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.573826 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-mz4rv"] Apr 16 23:55:38.668039 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.668013 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b7c83b2-95d5-4367-bc9b-6432584147a9-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-mz4rv\" (UID: \"1b7c83b2-95d5-4367-bc9b-6432584147a9\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mz4rv" Apr 16 23:55:38.668161 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.668067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7pp4\" (UniqueName: \"kubernetes.io/projected/1b7c83b2-95d5-4367-bc9b-6432584147a9-kube-api-access-f7pp4\") pod \"cert-manager-cainjector-8966b78d4-mz4rv\" (UID: \"1b7c83b2-95d5-4367-bc9b-6432584147a9\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mz4rv" Apr 16 23:55:38.768387 ip-10-0-133-231 
kubenswrapper[2576]: I0416 23:55:38.768365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b7c83b2-95d5-4367-bc9b-6432584147a9-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-mz4rv\" (UID: \"1b7c83b2-95d5-4367-bc9b-6432584147a9\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mz4rv" Apr 16 23:55:38.768525 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.768414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7pp4\" (UniqueName: \"kubernetes.io/projected/1b7c83b2-95d5-4367-bc9b-6432584147a9-kube-api-access-f7pp4\") pod \"cert-manager-cainjector-8966b78d4-mz4rv\" (UID: \"1b7c83b2-95d5-4367-bc9b-6432584147a9\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mz4rv" Apr 16 23:55:38.776267 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.776241 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b7c83b2-95d5-4367-bc9b-6432584147a9-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-mz4rv\" (UID: \"1b7c83b2-95d5-4367-bc9b-6432584147a9\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mz4rv" Apr 16 23:55:38.776413 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.776356 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7pp4\" (UniqueName: \"kubernetes.io/projected/1b7c83b2-95d5-4367-bc9b-6432584147a9-kube-api-access-f7pp4\") pod \"cert-manager-cainjector-8966b78d4-mz4rv\" (UID: \"1b7c83b2-95d5-4367-bc9b-6432584147a9\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mz4rv" Apr 16 23:55:38.873648 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.873584 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-mz4rv" Apr 16 23:55:38.986168 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.986137 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-mz4rv"] Apr 16 23:55:38.989445 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:55:38.989418 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b7c83b2_95d5_4367_bc9b_6432584147a9.slice/crio-826cf074bf93b694e95055825725de026b21fd57b243e3a4e86b4642321f03aa WatchSource:0}: Error finding container 826cf074bf93b694e95055825725de026b21fd57b243e3a4e86b4642321f03aa: Status 404 returned error can't find the container with id 826cf074bf93b694e95055825725de026b21fd57b243e3a4e86b4642321f03aa Apr 16 23:55:38.991218 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:38.991202 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:55:39.908870 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:39.908827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-mz4rv" event={"ID":"1b7c83b2-95d5-4367-bc9b-6432584147a9","Type":"ContainerStarted","Data":"826cf074bf93b694e95055825725de026b21fd57b243e3a4e86b4642321f03aa"} Apr 16 23:55:42.918184 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:42.918097 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-mz4rv" event={"ID":"1b7c83b2-95d5-4367-bc9b-6432584147a9","Type":"ContainerStarted","Data":"79551913e3ab9ff2c95eeeeeeb6fe70b52ff2d83b15c509b4825ba7afc777196"} Apr 16 23:55:42.931332 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:42.931283 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-mz4rv" podStartSLOduration=1.342974603 podStartE2EDuration="4.93125766s" 
podCreationTimestamp="2026-04-16 23:55:38 +0000 UTC" firstStartedPulling="2026-04-16 23:55:38.991337333 +0000 UTC m=+323.537327892" lastFinishedPulling="2026-04-16 23:55:42.579620386 +0000 UTC m=+327.125610949" observedRunningTime="2026-04-16 23:55:42.930644502 +0000 UTC m=+327.476635075" watchObservedRunningTime="2026-04-16 23:55:42.93125766 +0000 UTC m=+327.477248242" Apr 16 23:55:55.486817 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:55.486781 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-mw6gk"] Apr 16 23:55:55.492546 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:55.492529 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-mw6gk" Apr 16 23:55:55.494626 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:55.494603 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-z67fr\"" Apr 16 23:55:55.495931 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:55.495889 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-mw6gk"] Apr 16 23:55:55.587348 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:55.587322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxbwh\" (UniqueName: \"kubernetes.io/projected/0716a640-755e-44e4-b055-492695e584f9-kube-api-access-wxbwh\") pod \"cert-manager-759f64656b-mw6gk\" (UID: \"0716a640-755e-44e4-b055-492695e584f9\") " pod="cert-manager/cert-manager-759f64656b-mw6gk" Apr 16 23:55:55.587497 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:55.587380 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0716a640-755e-44e4-b055-492695e584f9-bound-sa-token\") pod \"cert-manager-759f64656b-mw6gk\" (UID: \"0716a640-755e-44e4-b055-492695e584f9\") " 
pod="cert-manager/cert-manager-759f64656b-mw6gk" Apr 16 23:55:55.688666 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:55.688621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxbwh\" (UniqueName: \"kubernetes.io/projected/0716a640-755e-44e4-b055-492695e584f9-kube-api-access-wxbwh\") pod \"cert-manager-759f64656b-mw6gk\" (UID: \"0716a640-755e-44e4-b055-492695e584f9\") " pod="cert-manager/cert-manager-759f64656b-mw6gk" Apr 16 23:55:55.688776 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:55.688691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0716a640-755e-44e4-b055-492695e584f9-bound-sa-token\") pod \"cert-manager-759f64656b-mw6gk\" (UID: \"0716a640-755e-44e4-b055-492695e584f9\") " pod="cert-manager/cert-manager-759f64656b-mw6gk" Apr 16 23:55:55.695797 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:55.695762 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0716a640-755e-44e4-b055-492695e584f9-bound-sa-token\") pod \"cert-manager-759f64656b-mw6gk\" (UID: \"0716a640-755e-44e4-b055-492695e584f9\") " pod="cert-manager/cert-manager-759f64656b-mw6gk" Apr 16 23:55:55.695892 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:55.695846 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxbwh\" (UniqueName: \"kubernetes.io/projected/0716a640-755e-44e4-b055-492695e584f9-kube-api-access-wxbwh\") pod \"cert-manager-759f64656b-mw6gk\" (UID: \"0716a640-755e-44e4-b055-492695e584f9\") " pod="cert-manager/cert-manager-759f64656b-mw6gk" Apr 16 23:55:55.802611 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:55.802590 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-mw6gk" Apr 16 23:55:55.921402 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:55.917767 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-mw6gk"] Apr 16 23:55:55.922699 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:55:55.922671 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0716a640_755e_44e4_b055_492695e584f9.slice/crio-4bdb5637d96f97eb03e5867aa201ba847ae4a156d5fe67230a76568c28b88550 WatchSource:0}: Error finding container 4bdb5637d96f97eb03e5867aa201ba847ae4a156d5fe67230a76568c28b88550: Status 404 returned error can't find the container with id 4bdb5637d96f97eb03e5867aa201ba847ae4a156d5fe67230a76568c28b88550 Apr 16 23:55:55.954669 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:55.954640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-mw6gk" event={"ID":"0716a640-755e-44e4-b055-492695e584f9","Type":"ContainerStarted","Data":"4bdb5637d96f97eb03e5867aa201ba847ae4a156d5fe67230a76568c28b88550"} Apr 16 23:55:56.958456 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:56.958424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-mw6gk" event={"ID":"0716a640-755e-44e4-b055-492695e584f9","Type":"ContainerStarted","Data":"b7859bcfdc8615d13774e00be48967373f4e9d614fcb84b5ae9fcda68d752f07"} Apr 16 23:55:56.971851 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:55:56.971799 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-mw6gk" podStartSLOduration=1.971783708 podStartE2EDuration="1.971783708s" podCreationTimestamp="2026-04-16 23:55:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:55:56.971382723 +0000 UTC 
m=+341.517373305" watchObservedRunningTime="2026-04-16 23:55:56.971783708 +0000 UTC m=+341.517774291" Apr 16 23:56:12.512637 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.512557 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q"] Apr 16 23:56:12.515707 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.515683 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" Apr 16 23:56:12.517740 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.517716 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-nnn2n\"" Apr 16 23:56:12.517740 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.517733 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 23:56:12.517985 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.517737 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 23:56:12.517985 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.517785 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 23:56:12.517985 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.517818 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 23:56:12.527391 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.527373 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q"] Apr 16 23:56:12.599464 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.599438 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f2d0abb-045d-414c-a5d0-9ba3f291a473-apiservice-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-9gm8q\" (UID: \"2f2d0abb-045d-414c-a5d0-9ba3f291a473\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" Apr 16 23:56:12.599592 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.599476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2f2d0abb-045d-414c-a5d0-9ba3f291a473-webhook-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-9gm8q\" (UID: \"2f2d0abb-045d-414c-a5d0-9ba3f291a473\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" Apr 16 23:56:12.599592 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.599508 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cq9b\" (UniqueName: \"kubernetes.io/projected/2f2d0abb-045d-414c-a5d0-9ba3f291a473-kube-api-access-8cq9b\") pod \"opendatahub-operator-controller-manager-bf54d8685-9gm8q\" (UID: \"2f2d0abb-045d-414c-a5d0-9ba3f291a473\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" Apr 16 23:56:12.700710 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.700683 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f2d0abb-045d-414c-a5d0-9ba3f291a473-apiservice-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-9gm8q\" (UID: \"2f2d0abb-045d-414c-a5d0-9ba3f291a473\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" Apr 16 23:56:12.700832 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.700718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/2f2d0abb-045d-414c-a5d0-9ba3f291a473-webhook-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-9gm8q\" (UID: \"2f2d0abb-045d-414c-a5d0-9ba3f291a473\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" Apr 16 23:56:12.700832 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.700739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cq9b\" (UniqueName: \"kubernetes.io/projected/2f2d0abb-045d-414c-a5d0-9ba3f291a473-kube-api-access-8cq9b\") pod \"opendatahub-operator-controller-manager-bf54d8685-9gm8q\" (UID: \"2f2d0abb-045d-414c-a5d0-9ba3f291a473\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" Apr 16 23:56:12.703368 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.703341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f2d0abb-045d-414c-a5d0-9ba3f291a473-apiservice-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-9gm8q\" (UID: \"2f2d0abb-045d-414c-a5d0-9ba3f291a473\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" Apr 16 23:56:12.703449 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.703341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2f2d0abb-045d-414c-a5d0-9ba3f291a473-webhook-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-9gm8q\" (UID: \"2f2d0abb-045d-414c-a5d0-9ba3f291a473\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" Apr 16 23:56:12.710583 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.710564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cq9b\" (UniqueName: \"kubernetes.io/projected/2f2d0abb-045d-414c-a5d0-9ba3f291a473-kube-api-access-8cq9b\") pod \"opendatahub-operator-controller-manager-bf54d8685-9gm8q\" (UID: 
\"2f2d0abb-045d-414c-a5d0-9ba3f291a473\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" Apr 16 23:56:12.825598 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.825581 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" Apr 16 23:56:12.947273 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.947240 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q"] Apr 16 23:56:12.950402 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:56:12.950372 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f2d0abb_045d_414c_a5d0_9ba3f291a473.slice/crio-ac7279eab897fe3cc1f67c14f2992442041bbdfe3f3c34413d7eed5439a630ad WatchSource:0}: Error finding container ac7279eab897fe3cc1f67c14f2992442041bbdfe3f3c34413d7eed5439a630ad: Status 404 returned error can't find the container with id ac7279eab897fe3cc1f67c14f2992442041bbdfe3f3c34413d7eed5439a630ad Apr 16 23:56:12.999803 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:12.999775 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" event={"ID":"2f2d0abb-045d-414c-a5d0-9ba3f291a473","Type":"ContainerStarted","Data":"ac7279eab897fe3cc1f67c14f2992442041bbdfe3f3c34413d7eed5439a630ad"} Apr 16 23:56:16.009080 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:16.009047 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" event={"ID":"2f2d0abb-045d-414c-a5d0-9ba3f291a473","Type":"ContainerStarted","Data":"f3d920f47121b4d6e8904545442a834ab7f1f12af71abac6f5e116eca7a6a9a2"} Apr 16 23:56:16.009506 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:16.009267 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" Apr 16 23:56:16.025840 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:16.025792 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" podStartSLOduration=1.5712847399999998 podStartE2EDuration="4.025780304s" podCreationTimestamp="2026-04-16 23:56:12 +0000 UTC" firstStartedPulling="2026-04-16 23:56:12.952138422 +0000 UTC m=+357.498128985" lastFinishedPulling="2026-04-16 23:56:15.406633989 +0000 UTC m=+359.952624549" observedRunningTime="2026-04-16 23:56:16.024789963 +0000 UTC m=+360.570780546" watchObservedRunningTime="2026-04-16 23:56:16.025780304 +0000 UTC m=+360.571770886" Apr 16 23:56:27.014041 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:27.014009 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-9gm8q" Apr 16 23:56:33.381727 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:33.381692 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-gcnvw"] Apr 16 23:56:33.385871 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:33.385855 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw" Apr 16 23:56:33.387542 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:33.387523 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-8r4t6\"" Apr 16 23:56:33.387672 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:33.387577 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 16 23:56:33.393128 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:33.393103 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-gcnvw"] Apr 16 23:56:33.543504 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:33.543468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxnz8\" (UniqueName: \"kubernetes.io/projected/d46e57f6-8d63-4f84-9e7a-e5e61281e599-kube-api-access-rxnz8\") pod \"odh-model-controller-858dbf95b8-gcnvw\" (UID: \"d46e57f6-8d63-4f84-9e7a-e5e61281e599\") " pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw" Apr 16 23:56:33.543504 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:33.543504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d46e57f6-8d63-4f84-9e7a-e5e61281e599-cert\") pod \"odh-model-controller-858dbf95b8-gcnvw\" (UID: \"d46e57f6-8d63-4f84-9e7a-e5e61281e599\") " pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw" Apr 16 23:56:33.644728 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:33.644660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxnz8\" (UniqueName: \"kubernetes.io/projected/d46e57f6-8d63-4f84-9e7a-e5e61281e599-kube-api-access-rxnz8\") pod \"odh-model-controller-858dbf95b8-gcnvw\" (UID: \"d46e57f6-8d63-4f84-9e7a-e5e61281e599\") " 
pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw" Apr 16 23:56:33.644728 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:33.644694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d46e57f6-8d63-4f84-9e7a-e5e61281e599-cert\") pod \"odh-model-controller-858dbf95b8-gcnvw\" (UID: \"d46e57f6-8d63-4f84-9e7a-e5e61281e599\") " pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw" Apr 16 23:56:33.644892 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:56:33.644807 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 23:56:33.644892 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:56:33.644865 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d46e57f6-8d63-4f84-9e7a-e5e61281e599-cert podName:d46e57f6-8d63-4f84-9e7a-e5e61281e599 nodeName:}" failed. No retries permitted until 2026-04-16 23:56:34.1448505 +0000 UTC m=+378.690841060 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d46e57f6-8d63-4f84-9e7a-e5e61281e599-cert") pod "odh-model-controller-858dbf95b8-gcnvw" (UID: "d46e57f6-8d63-4f84-9e7a-e5e61281e599") : secret "odh-model-controller-webhook-cert" not found Apr 16 23:56:33.652376 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:33.652342 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxnz8\" (UniqueName: \"kubernetes.io/projected/d46e57f6-8d63-4f84-9e7a-e5e61281e599-kube-api-access-rxnz8\") pod \"odh-model-controller-858dbf95b8-gcnvw\" (UID: \"d46e57f6-8d63-4f84-9e7a-e5e61281e599\") " pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw" Apr 16 23:56:34.147510 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:34.147473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d46e57f6-8d63-4f84-9e7a-e5e61281e599-cert\") pod \"odh-model-controller-858dbf95b8-gcnvw\" (UID: \"d46e57f6-8d63-4f84-9e7a-e5e61281e599\") " pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw" Apr 16 23:56:34.149965 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:34.149942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d46e57f6-8d63-4f84-9e7a-e5e61281e599-cert\") pod \"odh-model-controller-858dbf95b8-gcnvw\" (UID: \"d46e57f6-8d63-4f84-9e7a-e5e61281e599\") " pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw" Apr 16 23:56:34.296109 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:34.296077 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw" Apr 16 23:56:34.410902 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:34.410826 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-gcnvw"] Apr 16 23:56:34.413639 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:56:34.413609 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd46e57f6_8d63_4f84_9e7a_e5e61281e599.slice/crio-84c5ef62a1edb0a974e9cce4cacde5cd622a9612251e77c9b120a6ef0dd175bf WatchSource:0}: Error finding container 84c5ef62a1edb0a974e9cce4cacde5cd622a9612251e77c9b120a6ef0dd175bf: Status 404 returned error can't find the container with id 84c5ef62a1edb0a974e9cce4cacde5cd622a9612251e77c9b120a6ef0dd175bf Apr 16 23:56:35.057217 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:35.057175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw" event={"ID":"d46e57f6-8d63-4f84-9e7a-e5e61281e599","Type":"ContainerStarted","Data":"84c5ef62a1edb0a974e9cce4cacde5cd622a9612251e77c9b120a6ef0dd175bf"} Apr 16 23:56:38.068031 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:38.067996 2576 generic.go:358] "Generic (PLEG): container finished" podID="d46e57f6-8d63-4f84-9e7a-e5e61281e599" containerID="1fd6869869bc1bc77daccf113861bede1dcfc0aadf59c467821e20af75655e31" exitCode=1 Apr 16 23:56:38.068374 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:38.068083 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw" event={"ID":"d46e57f6-8d63-4f84-9e7a-e5e61281e599","Type":"ContainerDied","Data":"1fd6869869bc1bc77daccf113861bede1dcfc0aadf59c467821e20af75655e31"} Apr 16 23:56:38.068374 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:38.068335 2576 scope.go:117] "RemoveContainer" containerID="1fd6869869bc1bc77daccf113861bede1dcfc0aadf59c467821e20af75655e31" Apr 16 
23:56:38.359205 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:38.359182 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-vtshf"] Apr 16 23:56:38.362092 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:38.362075 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-vtshf" Apr 16 23:56:38.366007 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:38.365987 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 16 23:56:38.366007 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:38.365999 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-tg29z\"" Apr 16 23:56:38.373340 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:38.373309 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-vtshf"] Apr 16 23:56:38.380445 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:38.380424 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c923b6-f354-48cb-971f-893ba483dc99-cert\") pod \"kserve-controller-manager-856948b99f-vtshf\" (UID: \"e1c923b6-f354-48cb-971f-893ba483dc99\") " pod="opendatahub/kserve-controller-manager-856948b99f-vtshf" Apr 16 23:56:38.380555 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:38.380453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tl7g\" (UniqueName: \"kubernetes.io/projected/e1c923b6-f354-48cb-971f-893ba483dc99-kube-api-access-9tl7g\") pod \"kserve-controller-manager-856948b99f-vtshf\" (UID: \"e1c923b6-f354-48cb-971f-893ba483dc99\") " pod="opendatahub/kserve-controller-manager-856948b99f-vtshf" Apr 16 23:56:38.480873 ip-10-0-133-231 
kubenswrapper[2576]: I0416 23:56:38.480841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c923b6-f354-48cb-971f-893ba483dc99-cert\") pod \"kserve-controller-manager-856948b99f-vtshf\" (UID: \"e1c923b6-f354-48cb-971f-893ba483dc99\") " pod="opendatahub/kserve-controller-manager-856948b99f-vtshf" Apr 16 23:56:38.480873 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:38.480878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tl7g\" (UniqueName: \"kubernetes.io/projected/e1c923b6-f354-48cb-971f-893ba483dc99-kube-api-access-9tl7g\") pod \"kserve-controller-manager-856948b99f-vtshf\" (UID: \"e1c923b6-f354-48cb-971f-893ba483dc99\") " pod="opendatahub/kserve-controller-manager-856948b99f-vtshf" Apr 16 23:56:38.481094 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:56:38.481006 2576 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 23:56:38.481094 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:56:38.481075 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c923b6-f354-48cb-971f-893ba483dc99-cert podName:e1c923b6-f354-48cb-971f-893ba483dc99 nodeName:}" failed. No retries permitted until 2026-04-16 23:56:38.981057706 +0000 UTC m=+383.527048267 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1c923b6-f354-48cb-971f-893ba483dc99-cert") pod "kserve-controller-manager-856948b99f-vtshf" (UID: "e1c923b6-f354-48cb-971f-893ba483dc99") : secret "kserve-webhook-server-cert" not found Apr 16 23:56:38.493723 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:38.493692 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tl7g\" (UniqueName: \"kubernetes.io/projected/e1c923b6-f354-48cb-971f-893ba483dc99-kube-api-access-9tl7g\") pod \"kserve-controller-manager-856948b99f-vtshf\" (UID: \"e1c923b6-f354-48cb-971f-893ba483dc99\") " pod="opendatahub/kserve-controller-manager-856948b99f-vtshf" Apr 16 23:56:38.984453 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:38.984413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c923b6-f354-48cb-971f-893ba483dc99-cert\") pod \"kserve-controller-manager-856948b99f-vtshf\" (UID: \"e1c923b6-f354-48cb-971f-893ba483dc99\") " pod="opendatahub/kserve-controller-manager-856948b99f-vtshf" Apr 16 23:56:38.986983 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:38.986964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c923b6-f354-48cb-971f-893ba483dc99-cert\") pod \"kserve-controller-manager-856948b99f-vtshf\" (UID: \"e1c923b6-f354-48cb-971f-893ba483dc99\") " pod="opendatahub/kserve-controller-manager-856948b99f-vtshf" Apr 16 23:56:39.072097 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:39.072056 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw" event={"ID":"d46e57f6-8d63-4f84-9e7a-e5e61281e599","Type":"ContainerStarted","Data":"faa24e1c253a0394de5d341728f54e4584f8a8833bbb971b7bc89fec02a8ecf5"} Apr 16 23:56:39.072438 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:39.072182 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw" Apr 16 23:56:39.095423 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:39.095380 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw" podStartSLOduration=2.205803558 podStartE2EDuration="6.095367299s" podCreationTimestamp="2026-04-16 23:56:33 +0000 UTC" firstStartedPulling="2026-04-16 23:56:34.414814585 +0000 UTC m=+378.960805145" lastFinishedPulling="2026-04-16 23:56:38.304378323 +0000 UTC m=+382.850368886" observedRunningTime="2026-04-16 23:56:39.094200917 +0000 UTC m=+383.640191500" watchObservedRunningTime="2026-04-16 23:56:39.095367299 +0000 UTC m=+383.641357881" Apr 16 23:56:39.273163 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:39.273086 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-vtshf" Apr 16 23:56:39.402711 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:39.402676 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-vtshf"] Apr 16 23:56:39.406255 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:56:39.406228 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1c923b6_f354_48cb_971f_893ba483dc99.slice/crio-fe1b7601adfe58786c91247bb4090263e1d3b241d799acfd815d10c4d3f11dc9 WatchSource:0}: Error finding container fe1b7601adfe58786c91247bb4090263e1d3b241d799acfd815d10c4d3f11dc9: Status 404 returned error can't find the container with id fe1b7601adfe58786c91247bb4090263e1d3b241d799acfd815d10c4d3f11dc9 Apr 16 23:56:40.075830 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:40.075793 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-vtshf" 
event={"ID":"e1c923b6-f354-48cb-971f-893ba483dc99","Type":"ContainerStarted","Data":"fe1b7601adfe58786c91247bb4090263e1d3b241d799acfd815d10c4d3f11dc9"} Apr 16 23:56:40.987313 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:40.987280 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs"] Apr 16 23:56:40.990178 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:40.990164 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs" Apr 16 23:56:40.992433 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:40.992412 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 23:56:40.992998 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:40.992977 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 23:56:40.992998 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:40.992989 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 23:56:40.993134 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:40.992980 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 23:56:40.993134 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:40.992986 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-chms8\"" Apr 16 23:56:40.998540 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:40.998512 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2686cb8c-f64c-447d-bd8e-f25b04de1e6b-tls-certs\") pod \"kube-auth-proxy-5cd78b4564-xxxhs\" (UID: \"2686cb8c-f64c-447d-bd8e-f25b04de1e6b\") " 
pod="openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs" Apr 16 23:56:40.998623 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:40.998551 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2686cb8c-f64c-447d-bd8e-f25b04de1e6b-tmp\") pod \"kube-auth-proxy-5cd78b4564-xxxhs\" (UID: \"2686cb8c-f64c-447d-bd8e-f25b04de1e6b\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs" Apr 16 23:56:40.998668 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:40.998631 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rp5s\" (UniqueName: \"kubernetes.io/projected/2686cb8c-f64c-447d-bd8e-f25b04de1e6b-kube-api-access-4rp5s\") pod \"kube-auth-proxy-5cd78b4564-xxxhs\" (UID: \"2686cb8c-f64c-447d-bd8e-f25b04de1e6b\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs" Apr 16 23:56:41.001333 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:41.001312 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs"] Apr 16 23:56:41.099462 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:41.099438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rp5s\" (UniqueName: \"kubernetes.io/projected/2686cb8c-f64c-447d-bd8e-f25b04de1e6b-kube-api-access-4rp5s\") pod \"kube-auth-proxy-5cd78b4564-xxxhs\" (UID: \"2686cb8c-f64c-447d-bd8e-f25b04de1e6b\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs" Apr 16 23:56:41.099837 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:41.099472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2686cb8c-f64c-447d-bd8e-f25b04de1e6b-tls-certs\") pod \"kube-auth-proxy-5cd78b4564-xxxhs\" (UID: \"2686cb8c-f64c-447d-bd8e-f25b04de1e6b\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs" Apr 16 
23:56:41.099837 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:41.099498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2686cb8c-f64c-447d-bd8e-f25b04de1e6b-tmp\") pod \"kube-auth-proxy-5cd78b4564-xxxhs\" (UID: \"2686cb8c-f64c-447d-bd8e-f25b04de1e6b\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs" Apr 16 23:56:41.101835 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:41.101801 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2686cb8c-f64c-447d-bd8e-f25b04de1e6b-tmp\") pod \"kube-auth-proxy-5cd78b4564-xxxhs\" (UID: \"2686cb8c-f64c-447d-bd8e-f25b04de1e6b\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs" Apr 16 23:56:41.102307 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:41.102287 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2686cb8c-f64c-447d-bd8e-f25b04de1e6b-tls-certs\") pod \"kube-auth-proxy-5cd78b4564-xxxhs\" (UID: \"2686cb8c-f64c-447d-bd8e-f25b04de1e6b\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs" Apr 16 23:56:41.106619 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:41.106596 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rp5s\" (UniqueName: \"kubernetes.io/projected/2686cb8c-f64c-447d-bd8e-f25b04de1e6b-kube-api-access-4rp5s\") pod \"kube-auth-proxy-5cd78b4564-xxxhs\" (UID: \"2686cb8c-f64c-447d-bd8e-f25b04de1e6b\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs" Apr 16 23:56:41.299542 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:41.299511 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs" Apr 16 23:56:41.428337 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:41.428303 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs"] Apr 16 23:56:41.897708 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:56:41.897670 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2686cb8c_f64c_447d_bd8e_f25b04de1e6b.slice/crio-71e75a8db7af1d8f7b7ebeab1e2b080212ac1b0bc5ae9022f0f1304e02ab30de WatchSource:0}: Error finding container 71e75a8db7af1d8f7b7ebeab1e2b080212ac1b0bc5ae9022f0f1304e02ab30de: Status 404 returned error can't find the container with id 71e75a8db7af1d8f7b7ebeab1e2b080212ac1b0bc5ae9022f0f1304e02ab30de Apr 16 23:56:42.081926 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:42.081875 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs" event={"ID":"2686cb8c-f64c-447d-bd8e-f25b04de1e6b","Type":"ContainerStarted","Data":"71e75a8db7af1d8f7b7ebeab1e2b080212ac1b0bc5ae9022f0f1304e02ab30de"} Apr 16 23:56:42.083068 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:42.083045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-vtshf" event={"ID":"e1c923b6-f354-48cb-971f-893ba483dc99","Type":"ContainerStarted","Data":"8d53f60b17a81783bf2a3f21af1c8e9a320d4fd311c29720a17cd7985e75bbc4"} Apr 16 23:56:42.083222 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:42.083204 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-vtshf" Apr 16 23:56:42.100783 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:42.100737 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-vtshf" podStartSLOduration=1.565021191 
podStartE2EDuration="4.100725524s" podCreationTimestamp="2026-04-16 23:56:38 +0000 UTC" firstStartedPulling="2026-04-16 23:56:39.40804525 +0000 UTC m=+383.954035811" lastFinishedPulling="2026-04-16 23:56:41.943749578 +0000 UTC m=+386.489740144" observedRunningTime="2026-04-16 23:56:42.098849093 +0000 UTC m=+386.644839675" watchObservedRunningTime="2026-04-16 23:56:42.100725524 +0000 UTC m=+386.646716127"
Apr 16 23:56:45.093794 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:45.093757 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs" event={"ID":"2686cb8c-f64c-447d-bd8e-f25b04de1e6b","Type":"ContainerStarted","Data":"0d7f72296fc202e639897264f5b6a63f1721c26d70afcc418c555f49e709b8f6"}
Apr 16 23:56:45.108119 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:45.108073 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-5cd78b4564-xxxhs" podStartSLOduration=2.5794286140000002 podStartE2EDuration="5.10805945s" podCreationTimestamp="2026-04-16 23:56:40 +0000 UTC" firstStartedPulling="2026-04-16 23:56:41.899494251 +0000 UTC m=+386.445484811" lastFinishedPulling="2026-04-16 23:56:44.428125073 +0000 UTC m=+388.974115647" observedRunningTime="2026-04-16 23:56:45.10659683 +0000 UTC m=+389.652587413" watchObservedRunningTime="2026-04-16 23:56:45.10805945 +0000 UTC m=+389.654050032"
Apr 16 23:56:50.078398 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:50.078366 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-gcnvw"
Apr 16 23:56:55.995483 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:55.995448 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq"]
Apr 16 23:56:56.004501 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:56.004478 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq"
Apr 16 23:56:56.006849 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:56.006824 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 16 23:56:56.007000 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:56.006827 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-nq4vx\""
Apr 16 23:56:56.007000 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:56.006827 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 16 23:56:56.014254 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:56.014220 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq"]
Apr 16 23:56:56.114606 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:56.114575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cklkr\" (UniqueName: \"kubernetes.io/projected/7b85ff9a-a6af-4668-aec6-136f3131b9c2-kube-api-access-cklkr\") pod \"servicemesh-operator3-55f49c5f94-gcbcq\" (UID: \"7b85ff9a-a6af-4668-aec6-136f3131b9c2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq"
Apr 16 23:56:56.114755 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:56.114611 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/7b85ff9a-a6af-4668-aec6-136f3131b9c2-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gcbcq\" (UID: \"7b85ff9a-a6af-4668-aec6-136f3131b9c2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq"
Apr 16 23:56:56.215766 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:56.215738 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cklkr\" (UniqueName: \"kubernetes.io/projected/7b85ff9a-a6af-4668-aec6-136f3131b9c2-kube-api-access-cklkr\") pod \"servicemesh-operator3-55f49c5f94-gcbcq\" (UID: \"7b85ff9a-a6af-4668-aec6-136f3131b9c2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq"
Apr 16 23:56:56.215906 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:56.215774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/7b85ff9a-a6af-4668-aec6-136f3131b9c2-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gcbcq\" (UID: \"7b85ff9a-a6af-4668-aec6-136f3131b9c2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq"
Apr 16 23:56:56.218406 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:56.218375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/7b85ff9a-a6af-4668-aec6-136f3131b9c2-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gcbcq\" (UID: \"7b85ff9a-a6af-4668-aec6-136f3131b9c2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq"
Apr 16 23:56:56.224242 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:56.224206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cklkr\" (UniqueName: \"kubernetes.io/projected/7b85ff9a-a6af-4668-aec6-136f3131b9c2-kube-api-access-cklkr\") pod \"servicemesh-operator3-55f49c5f94-gcbcq\" (UID: \"7b85ff9a-a6af-4668-aec6-136f3131b9c2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq"
Apr 16 23:56:56.315658 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:56.315631 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq"
Apr 16 23:56:56.434576 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:56.434544 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq"]
Apr 16 23:56:56.437540 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:56:56.437512 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b85ff9a_a6af_4668_aec6_136f3131b9c2.slice/crio-f54960925e3db18af869dcb453c9c624e39a9c6ac4136e087ff74bfec15ea1d7 WatchSource:0}: Error finding container f54960925e3db18af869dcb453c9c624e39a9c6ac4136e087ff74bfec15ea1d7: Status 404 returned error can't find the container with id f54960925e3db18af869dcb453c9c624e39a9c6ac4136e087ff74bfec15ea1d7
Apr 16 23:56:57.131328 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:56:57.131292 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq" event={"ID":"7b85ff9a-a6af-4668-aec6-136f3131b9c2","Type":"ContainerStarted","Data":"f54960925e3db18af869dcb453c9c624e39a9c6ac4136e087ff74bfec15ea1d7"}
Apr 16 23:57:02.149295 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:02.149253 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq" event={"ID":"7b85ff9a-a6af-4668-aec6-136f3131b9c2","Type":"ContainerStarted","Data":"7753c0b02c0f7f566e76c6a14dcd42816dc3d2dd49a5de9a93c6568a44f679f9"}
Apr 16 23:57:02.149680 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:02.149386 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq"
Apr 16 23:57:02.167869 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:02.167826 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq" podStartSLOduration=2.330188373 podStartE2EDuration="7.167814893s" podCreationTimestamp="2026-04-16 23:56:55 +0000 UTC" firstStartedPulling="2026-04-16 23:56:56.439971852 +0000 UTC m=+400.985962412" lastFinishedPulling="2026-04-16 23:57:01.277598368 +0000 UTC m=+405.823588932" observedRunningTime="2026-04-16 23:57:02.166946072 +0000 UTC m=+406.712936651" watchObservedRunningTime="2026-04-16 23:57:02.167814893 +0000 UTC m=+406.713805475"
Apr 16 23:57:12.579263 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.579203 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"]
Apr 16 23:57:12.586161 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.586130 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.588095 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.588069 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 16 23:57:12.588095 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.588098 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-j9dxs\""
Apr 16 23:57:12.588280 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.588071 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 16 23:57:12.588525 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.588509 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 16 23:57:12.588863 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.588842 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 23:57:12.591108 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.591088 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"]
Apr 16 23:57:12.629989 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.629961 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq5dm\" (UniqueName: \"kubernetes.io/projected/c588d5fa-35ce-4320-94e8-2b3c03dae725-kube-api-access-tq5dm\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.630109 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.629998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c588d5fa-35ce-4320-94e8-2b3c03dae725-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.630109 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.630022 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c588d5fa-35ce-4320-94e8-2b3c03dae725-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.630188 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.630117 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c588d5fa-35ce-4320-94e8-2b3c03dae725-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.630188 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.630155 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c588d5fa-35ce-4320-94e8-2b3c03dae725-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.630252 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.630207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c588d5fa-35ce-4320-94e8-2b3c03dae725-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.630252 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.630224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c588d5fa-35ce-4320-94e8-2b3c03dae725-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.730824 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.730796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c588d5fa-35ce-4320-94e8-2b3c03dae725-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.730962 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.730831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c588d5fa-35ce-4320-94e8-2b3c03dae725-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.730962 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.730859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c588d5fa-35ce-4320-94e8-2b3c03dae725-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.730962 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.730887 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c588d5fa-35ce-4320-94e8-2b3c03dae725-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.730962 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.730910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c588d5fa-35ce-4320-94e8-2b3c03dae725-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.730962 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.730952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c588d5fa-35ce-4320-94e8-2b3c03dae725-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.731229 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.730987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tq5dm\" (UniqueName: \"kubernetes.io/projected/c588d5fa-35ce-4320-94e8-2b3c03dae725-kube-api-access-tq5dm\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.731480 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.731452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c588d5fa-35ce-4320-94e8-2b3c03dae725-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.733467 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.733431 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c588d5fa-35ce-4320-94e8-2b3c03dae725-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.733697 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.733671 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c588d5fa-35ce-4320-94e8-2b3c03dae725-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.733788 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.733767 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c588d5fa-35ce-4320-94e8-2b3c03dae725-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.733841 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.733799 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c588d5fa-35ce-4320-94e8-2b3c03dae725-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.738227 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.738201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c588d5fa-35ce-4320-94e8-2b3c03dae725-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.738450 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.738434 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq5dm\" (UniqueName: \"kubernetes.io/projected/c588d5fa-35ce-4320-94e8-2b3c03dae725-kube-api-access-tq5dm\") pod \"istiod-openshift-gateway-55ff986f96-kckq4\" (UID: \"c588d5fa-35ce-4320-94e8-2b3c03dae725\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:12.896801 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:12.896715 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:13.023774 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:13.023651 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"]
Apr 16 23:57:13.030483 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:57:13.026613 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc588d5fa_35ce_4320_94e8_2b3c03dae725.slice/crio-1fad15273339289171937ff37f841be08afab3fca90fe056fa00ca4501d13889 WatchSource:0}: Error finding container 1fad15273339289171937ff37f841be08afab3fca90fe056fa00ca4501d13889: Status 404 returned error can't find the container with id 1fad15273339289171937ff37f841be08afab3fca90fe056fa00ca4501d13889
Apr 16 23:57:13.091814 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:13.091788 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-vtshf"
Apr 16 23:57:13.154184 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:13.154104 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gcbcq"
Apr 16 23:57:13.183359 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:13.183327 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4" event={"ID":"c588d5fa-35ce-4320-94e8-2b3c03dae725","Type":"ContainerStarted","Data":"1fad15273339289171937ff37f841be08afab3fca90fe056fa00ca4501d13889"}
Apr 16 23:57:16.159686 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:16.159649 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"}
Apr 16 23:57:16.159961 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:16.159716 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"}
Apr 16 23:57:17.202092 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:17.202051 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4" event={"ID":"c588d5fa-35ce-4320-94e8-2b3c03dae725","Type":"ContainerStarted","Data":"c8c003c612bf7e3b88b4e8477ca3c84b04fa408a663bf95cae2e24ce7606bf10"}
Apr 16 23:57:17.202612 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:17.202580 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:57:17.204571 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:17.204528 2576 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-kckq4 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 16 23:57:17.204691 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:17.204605 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4" podUID="c588d5fa-35ce-4320-94e8-2b3c03dae725" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 23:57:17.222970 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:17.222893 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4" podStartSLOduration=2.096245916 podStartE2EDuration="5.222878131s" podCreationTimestamp="2026-04-16 23:57:12 +0000 UTC" firstStartedPulling="2026-04-16 23:57:13.032834054 +0000 UTC m=+417.578824620" lastFinishedPulling="2026-04-16 23:57:16.159466275 +0000 UTC m=+420.705456835" observedRunningTime="2026-04-16 23:57:17.221047467 +0000 UTC m=+421.767038060" watchObservedRunningTime="2026-04-16 23:57:17.222878131 +0000 UTC m=+421.768868714"
Apr 16 23:57:18.206157 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:57:18.206131 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kckq4"
Apr 16 23:58:09.737340 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:09.737301 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-w2w4s"]
Apr 16 23:58:09.740252 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:09.740236 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-w2w4s"
Apr 16 23:58:09.742627 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:09.742599 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 23:58:09.742627 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:09.742623 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 23:58:09.742806 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:09.742665 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-fqwgv\""
Apr 16 23:58:09.750043 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:09.750015 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-w2w4s"]
Apr 16 23:58:09.849075 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:09.849042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr5jv\" (UniqueName: \"kubernetes.io/projected/925fcd4a-e982-4f90-b9a1-99b50b7e6140-kube-api-access-tr5jv\") pod \"authorino-operator-657f44b778-w2w4s\" (UID: \"925fcd4a-e982-4f90-b9a1-99b50b7e6140\") " pod="kuadrant-system/authorino-operator-657f44b778-w2w4s"
Apr 16 23:58:09.950124 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:09.950092 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tr5jv\" (UniqueName: \"kubernetes.io/projected/925fcd4a-e982-4f90-b9a1-99b50b7e6140-kube-api-access-tr5jv\") pod \"authorino-operator-657f44b778-w2w4s\" (UID: \"925fcd4a-e982-4f90-b9a1-99b50b7e6140\") " pod="kuadrant-system/authorino-operator-657f44b778-w2w4s"
Apr 16 23:58:09.957316 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:09.957287 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr5jv\" (UniqueName: \"kubernetes.io/projected/925fcd4a-e982-4f90-b9a1-99b50b7e6140-kube-api-access-tr5jv\") pod \"authorino-operator-657f44b778-w2w4s\" (UID: \"925fcd4a-e982-4f90-b9a1-99b50b7e6140\") " pod="kuadrant-system/authorino-operator-657f44b778-w2w4s"
Apr 16 23:58:10.065553 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:10.065532 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-w2w4s"
Apr 16 23:58:10.188363 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:10.188334 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-w2w4s"]
Apr 16 23:58:10.191205 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:58:10.191175 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod925fcd4a_e982_4f90_b9a1_99b50b7e6140.slice/crio-4dec930cef97a2795bfc5b6320e579adfde26c7a2f6250a83a6920ad7a250c00 WatchSource:0}: Error finding container 4dec930cef97a2795bfc5b6320e579adfde26c7a2f6250a83a6920ad7a250c00: Status 404 returned error can't find the container with id 4dec930cef97a2795bfc5b6320e579adfde26c7a2f6250a83a6920ad7a250c00
Apr 16 23:58:10.360806 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:10.360727 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-w2w4s" event={"ID":"925fcd4a-e982-4f90-b9a1-99b50b7e6140","Type":"ContainerStarted","Data":"4dec930cef97a2795bfc5b6320e579adfde26c7a2f6250a83a6920ad7a250c00"}
Apr 16 23:58:12.368875 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:12.368841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-w2w4s" event={"ID":"925fcd4a-e982-4f90-b9a1-99b50b7e6140","Type":"ContainerStarted","Data":"583919f95d734f467ccd24bcdc798ee602aff0be714729796717146a43919a34"}
Apr 16 23:58:12.369311 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:12.369023 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-w2w4s"
Apr 16 23:58:12.386702 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:12.386658 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-w2w4s" podStartSLOduration=1.93408824 podStartE2EDuration="3.38664738s" podCreationTimestamp="2026-04-16 23:58:09 +0000 UTC" firstStartedPulling="2026-04-16 23:58:10.193590061 +0000 UTC m=+474.739580625" lastFinishedPulling="2026-04-16 23:58:11.6461492 +0000 UTC m=+476.192139765" observedRunningTime="2026-04-16 23:58:12.385279391 +0000 UTC m=+476.931269972" watchObservedRunningTime="2026-04-16 23:58:12.38664738 +0000 UTC m=+476.932637961"
Apr 16 23:58:23.374656 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:58:23.374630 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-w2w4s"
Apr 16 23:59:25.269950 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:25.269897 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-p92k9"]
Apr 16 23:59:25.273008 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:25.272991 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9"
Apr 16 23:59:25.275023 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:25.275001 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 23:59:25.275145 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:25.275011 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-skvbv\""
Apr 16 23:59:25.285428 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:25.285408 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-p92k9"]
Apr 16 23:59:25.363707 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:25.363677 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8d80c8a3-ac90-443a-ad71-bebfe397447c-config-file\") pod \"limitador-limitador-7d549b5b-p92k9\" (UID: \"8d80c8a3-ac90-443a-ad71-bebfe397447c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9"
Apr 16 23:59:25.363895 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:25.363740 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h447d\" (UniqueName: \"kubernetes.io/projected/8d80c8a3-ac90-443a-ad71-bebfe397447c-kube-api-access-h447d\") pod \"limitador-limitador-7d549b5b-p92k9\" (UID: \"8d80c8a3-ac90-443a-ad71-bebfe397447c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9"
Apr 16 23:59:25.369570 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:25.369537 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-p92k9"]
Apr 16 23:59:25.464675 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:25.464645 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h447d\" (UniqueName: \"kubernetes.io/projected/8d80c8a3-ac90-443a-ad71-bebfe397447c-kube-api-access-h447d\") pod \"limitador-limitador-7d549b5b-p92k9\" (UID: \"8d80c8a3-ac90-443a-ad71-bebfe397447c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9"
Apr 16 23:59:25.464828 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:25.464688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8d80c8a3-ac90-443a-ad71-bebfe397447c-config-file\") pod \"limitador-limitador-7d549b5b-p92k9\" (UID: \"8d80c8a3-ac90-443a-ad71-bebfe397447c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9"
Apr 16 23:59:25.465263 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:25.465242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8d80c8a3-ac90-443a-ad71-bebfe397447c-config-file\") pod \"limitador-limitador-7d549b5b-p92k9\" (UID: \"8d80c8a3-ac90-443a-ad71-bebfe397447c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9"
Apr 16 23:59:25.471519 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:25.471500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h447d\" (UniqueName: \"kubernetes.io/projected/8d80c8a3-ac90-443a-ad71-bebfe397447c-kube-api-access-h447d\") pod \"limitador-limitador-7d549b5b-p92k9\" (UID: \"8d80c8a3-ac90-443a-ad71-bebfe397447c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9"
Apr 16 23:59:25.583335 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:25.583303 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9"
Apr 16 23:59:25.705843 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:25.705818 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-p92k9"]
Apr 16 23:59:25.708358 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:59:25.708330 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d80c8a3_ac90_443a_ad71_bebfe397447c.slice/crio-22bb80aed903b72840a056d624593d88436a284da2f456c0b1c24827d3c199e9 WatchSource:0}: Error finding container 22bb80aed903b72840a056d624593d88436a284da2f456c0b1c24827d3c199e9: Status 404 returned error can't find the container with id 22bb80aed903b72840a056d624593d88436a284da2f456c0b1c24827d3c199e9
Apr 16 23:59:26.600366 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:26.600313 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9" event={"ID":"8d80c8a3-ac90-443a-ad71-bebfe397447c","Type":"ContainerStarted","Data":"22bb80aed903b72840a056d624593d88436a284da2f456c0b1c24827d3c199e9"}
Apr 16 23:59:28.608423 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:28.608352 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9" event={"ID":"8d80c8a3-ac90-443a-ad71-bebfe397447c","Type":"ContainerStarted","Data":"c246cfafad7c1099ac050e39de4c9a6b14bc5f9b19a4439b8b88a16aac2b06e1"}
Apr 16 23:59:28.608741 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:28.608496 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9"
Apr 16 23:59:28.622106 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:28.622057 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9" podStartSLOduration=1.009416443 podStartE2EDuration="3.622039952s" podCreationTimestamp="2026-04-16 23:59:25 +0000 UTC" firstStartedPulling="2026-04-16 23:59:25.710115882 +0000 UTC m=+550.256106442" lastFinishedPulling="2026-04-16 23:59:28.322739388 +0000 UTC m=+552.868729951" observedRunningTime="2026-04-16 23:59:28.621417672 +0000 UTC m=+553.167408255" watchObservedRunningTime="2026-04-16 23:59:28.622039952 +0000 UTC m=+553.168030535"
Apr 16 23:59:39.613139 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:39.613101 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9"
Apr 16 23:59:39.856957 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:39.856905 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-p92k9"]
Apr 16 23:59:39.857145 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:39.857123 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9" podUID="8d80c8a3-ac90-443a-ad71-bebfe397447c" containerName="limitador" containerID="cri-o://c246cfafad7c1099ac050e39de4c9a6b14bc5f9b19a4439b8b88a16aac2b06e1" gracePeriod=30
Apr 16 23:59:40.403167 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.403145 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9"
Apr 16 23:59:40.477567 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.477495 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8d80c8a3-ac90-443a-ad71-bebfe397447c-config-file\") pod \"8d80c8a3-ac90-443a-ad71-bebfe397447c\" (UID: \"8d80c8a3-ac90-443a-ad71-bebfe397447c\") "
Apr 16 23:59:40.477567 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.477544 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h447d\" (UniqueName: \"kubernetes.io/projected/8d80c8a3-ac90-443a-ad71-bebfe397447c-kube-api-access-h447d\") pod \"8d80c8a3-ac90-443a-ad71-bebfe397447c\" (UID: \"8d80c8a3-ac90-443a-ad71-bebfe397447c\") "
Apr 16 23:59:40.477833 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.477811 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d80c8a3-ac90-443a-ad71-bebfe397447c-config-file" (OuterVolumeSpecName: "config-file") pod "8d80c8a3-ac90-443a-ad71-bebfe397447c" (UID: "8d80c8a3-ac90-443a-ad71-bebfe397447c"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:59:40.479752 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.479721 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d80c8a3-ac90-443a-ad71-bebfe397447c-kube-api-access-h447d" (OuterVolumeSpecName: "kube-api-access-h447d") pod "8d80c8a3-ac90-443a-ad71-bebfe397447c" (UID: "8d80c8a3-ac90-443a-ad71-bebfe397447c"). InnerVolumeSpecName "kube-api-access-h447d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:59:40.578414 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.578382 2576 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8d80c8a3-ac90-443a-ad71-bebfe397447c-config-file\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\""
Apr 16 23:59:40.578414 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.578408 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h447d\" (UniqueName: \"kubernetes.io/projected/8d80c8a3-ac90-443a-ad71-bebfe397447c-kube-api-access-h447d\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\""
Apr 16 23:59:40.647854 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.647823 2576 generic.go:358] "Generic (PLEG): container finished" podID="8d80c8a3-ac90-443a-ad71-bebfe397447c" containerID="c246cfafad7c1099ac050e39de4c9a6b14bc5f9b19a4439b8b88a16aac2b06e1" exitCode=0
Apr 16 23:59:40.648245 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.647884 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9" Apr 16 23:59:40.648245 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.647937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9" event={"ID":"8d80c8a3-ac90-443a-ad71-bebfe397447c","Type":"ContainerDied","Data":"c246cfafad7c1099ac050e39de4c9a6b14bc5f9b19a4439b8b88a16aac2b06e1"} Apr 16 23:59:40.648245 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.647973 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-p92k9" event={"ID":"8d80c8a3-ac90-443a-ad71-bebfe397447c","Type":"ContainerDied","Data":"22bb80aed903b72840a056d624593d88436a284da2f456c0b1c24827d3c199e9"} Apr 16 23:59:40.648245 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.647989 2576 scope.go:117] "RemoveContainer" containerID="c246cfafad7c1099ac050e39de4c9a6b14bc5f9b19a4439b8b88a16aac2b06e1" Apr 16 23:59:40.655677 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.655660 2576 scope.go:117] "RemoveContainer" containerID="c246cfafad7c1099ac050e39de4c9a6b14bc5f9b19a4439b8b88a16aac2b06e1" Apr 16 23:59:40.655912 ip-10-0-133-231 kubenswrapper[2576]: E0416 23:59:40.655895 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c246cfafad7c1099ac050e39de4c9a6b14bc5f9b19a4439b8b88a16aac2b06e1\": container with ID starting with c246cfafad7c1099ac050e39de4c9a6b14bc5f9b19a4439b8b88a16aac2b06e1 not found: ID does not exist" containerID="c246cfafad7c1099ac050e39de4c9a6b14bc5f9b19a4439b8b88a16aac2b06e1" Apr 16 23:59:40.656012 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.655935 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c246cfafad7c1099ac050e39de4c9a6b14bc5f9b19a4439b8b88a16aac2b06e1"} err="failed to get container status 
\"c246cfafad7c1099ac050e39de4c9a6b14bc5f9b19a4439b8b88a16aac2b06e1\": rpc error: code = NotFound desc = could not find container \"c246cfafad7c1099ac050e39de4c9a6b14bc5f9b19a4439b8b88a16aac2b06e1\": container with ID starting with c246cfafad7c1099ac050e39de4c9a6b14bc5f9b19a4439b8b88a16aac2b06e1 not found: ID does not exist" Apr 16 23:59:40.666337 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.666313 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-p92k9"] Apr 16 23:59:40.670801 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:40.670779 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-p92k9"] Apr 16 23:59:41.980691 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:41.980656 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d80c8a3-ac90-443a-ad71-bebfe397447c" path="/var/lib/kubelet/pods/8d80c8a3-ac90-443a-ad71-bebfe397447c/volumes" Apr 16 23:59:43.114029 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.113994 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-rdrwh"] Apr 16 23:59:43.114412 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.114300 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d80c8a3-ac90-443a-ad71-bebfe397447c" containerName="limitador" Apr 16 23:59:43.114412 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.114316 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d80c8a3-ac90-443a-ad71-bebfe397447c" containerName="limitador" Apr 16 23:59:43.114412 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.114370 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d80c8a3-ac90-443a-ad71-bebfe397447c" containerName="limitador" Apr 16 23:59:43.118502 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.118478 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-rdrwh" Apr 16 23:59:43.120362 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.120339 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 16 23:59:43.120504 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.120428 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-ctt2d\"" Apr 16 23:59:43.125250 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.125200 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-rdrwh"] Apr 16 23:59:43.198110 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.198081 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqfzq\" (UniqueName: \"kubernetes.io/projected/46fd5449-3f1d-4e95-9bd4-4d837df7c0ba-kube-api-access-bqfzq\") pod \"postgres-868db5846d-rdrwh\" (UID: \"46fd5449-3f1d-4e95-9bd4-4d837df7c0ba\") " pod="opendatahub/postgres-868db5846d-rdrwh" Apr 16 23:59:43.198243 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.198164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/46fd5449-3f1d-4e95-9bd4-4d837df7c0ba-data\") pod \"postgres-868db5846d-rdrwh\" (UID: \"46fd5449-3f1d-4e95-9bd4-4d837df7c0ba\") " pod="opendatahub/postgres-868db5846d-rdrwh" Apr 16 23:59:43.298676 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.298629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/46fd5449-3f1d-4e95-9bd4-4d837df7c0ba-data\") pod \"postgres-868db5846d-rdrwh\" (UID: \"46fd5449-3f1d-4e95-9bd4-4d837df7c0ba\") " pod="opendatahub/postgres-868db5846d-rdrwh" Apr 16 23:59:43.298803 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.298711 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bqfzq\" (UniqueName: \"kubernetes.io/projected/46fd5449-3f1d-4e95-9bd4-4d837df7c0ba-kube-api-access-bqfzq\") pod \"postgres-868db5846d-rdrwh\" (UID: \"46fd5449-3f1d-4e95-9bd4-4d837df7c0ba\") " pod="opendatahub/postgres-868db5846d-rdrwh" Apr 16 23:59:43.299037 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.299018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/46fd5449-3f1d-4e95-9bd4-4d837df7c0ba-data\") pod \"postgres-868db5846d-rdrwh\" (UID: \"46fd5449-3f1d-4e95-9bd4-4d837df7c0ba\") " pod="opendatahub/postgres-868db5846d-rdrwh" Apr 16 23:59:43.306123 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.306096 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqfzq\" (UniqueName: \"kubernetes.io/projected/46fd5449-3f1d-4e95-9bd4-4d837df7c0ba-kube-api-access-bqfzq\") pod \"postgres-868db5846d-rdrwh\" (UID: \"46fd5449-3f1d-4e95-9bd4-4d837df7c0ba\") " pod="opendatahub/postgres-868db5846d-rdrwh" Apr 16 23:59:43.430438 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.430374 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-rdrwh" Apr 16 23:59:43.547008 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.546982 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-rdrwh"] Apr 16 23:59:43.549541 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:59:43.549508 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46fd5449_3f1d_4e95_9bd4_4d837df7c0ba.slice/crio-f3d8b4e39676b368c74c32a48171bc5a386ed068086a2b1322f22922f849b663 WatchSource:0}: Error finding container f3d8b4e39676b368c74c32a48171bc5a386ed068086a2b1322f22922f849b663: Status 404 returned error can't find the container with id f3d8b4e39676b368c74c32a48171bc5a386ed068086a2b1322f22922f849b663 Apr 16 23:59:43.664201 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:43.664170 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-rdrwh" event={"ID":"46fd5449-3f1d-4e95-9bd4-4d837df7c0ba","Type":"ContainerStarted","Data":"f3d8b4e39676b368c74c32a48171bc5a386ed068086a2b1322f22922f849b663"} Apr 16 23:59:48.684610 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:48.684571 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-rdrwh" event={"ID":"46fd5449-3f1d-4e95-9bd4-4d837df7c0ba","Type":"ContainerStarted","Data":"094d0e444bd0823f28347822ccc1e56a803a0d429e72ec0ef5240b75669a67d1"} Apr 16 23:59:48.685052 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:48.684624 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-rdrwh" Apr 16 23:59:48.699177 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:48.699100 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-rdrwh" podStartSLOduration=0.811482781 podStartE2EDuration="5.699085886s" podCreationTimestamp="2026-04-16 23:59:43 +0000 UTC" 
firstStartedPulling="2026-04-16 23:59:43.550804567 +0000 UTC m=+568.096795131" lastFinishedPulling="2026-04-16 23:59:48.438407676 +0000 UTC m=+572.984398236" observedRunningTime="2026-04-16 23:59:48.697454718 +0000 UTC m=+573.243445300" watchObservedRunningTime="2026-04-16 23:59:48.699085886 +0000 UTC m=+573.245076468" Apr 16 23:59:54.715855 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:54.715829 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-rdrwh" Apr 16 23:59:55.527697 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.527668 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-8c469dc49-gb5dm"] Apr 16 23:59:55.533197 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.533180 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-8c469dc49-gb5dm" Apr 16 23:59:55.535312 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.535291 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 16 23:59:55.535383 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.535302 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-6fjhg\"" Apr 16 23:59:55.535383 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.535362 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 16 23:59:55.541164 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.541139 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-8c469dc49-gb5dm"] Apr 16 23:59:55.551112 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.551089 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7446c54ff4-5pq84"] Apr 16 23:59:55.554586 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.554569 2576 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="opendatahub/maas-controller-7446c54ff4-5pq84" Apr 16 23:59:55.556567 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.556545 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-f2s24\"" Apr 16 23:59:55.563841 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.563819 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7446c54ff4-5pq84"] Apr 16 23:59:55.703204 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.703168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/30b6f764-0001-4a0d-b357-241415ad798d-maas-api-tls\") pod \"maas-api-8c469dc49-gb5dm\" (UID: \"30b6f764-0001-4a0d-b357-241415ad798d\") " pod="opendatahub/maas-api-8c469dc49-gb5dm" Apr 16 23:59:55.703400 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.703213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlxq2\" (UniqueName: \"kubernetes.io/projected/30b6f764-0001-4a0d-b357-241415ad798d-kube-api-access-vlxq2\") pod \"maas-api-8c469dc49-gb5dm\" (UID: \"30b6f764-0001-4a0d-b357-241415ad798d\") " pod="opendatahub/maas-api-8c469dc49-gb5dm" Apr 16 23:59:55.703400 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.703302 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsn5d\" (UniqueName: \"kubernetes.io/projected/96924fc3-b273-4f2d-80ae-329a8e077194-kube-api-access-lsn5d\") pod \"maas-controller-7446c54ff4-5pq84\" (UID: \"96924fc3-b273-4f2d-80ae-329a8e077194\") " pod="opendatahub/maas-controller-7446c54ff4-5pq84" Apr 16 23:59:55.804223 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.803766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsn5d\" (UniqueName: 
\"kubernetes.io/projected/96924fc3-b273-4f2d-80ae-329a8e077194-kube-api-access-lsn5d\") pod \"maas-controller-7446c54ff4-5pq84\" (UID: \"96924fc3-b273-4f2d-80ae-329a8e077194\") " pod="opendatahub/maas-controller-7446c54ff4-5pq84" Apr 16 23:59:55.804223 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.803862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/30b6f764-0001-4a0d-b357-241415ad798d-maas-api-tls\") pod \"maas-api-8c469dc49-gb5dm\" (UID: \"30b6f764-0001-4a0d-b357-241415ad798d\") " pod="opendatahub/maas-api-8c469dc49-gb5dm" Apr 16 23:59:55.804223 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.803890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlxq2\" (UniqueName: \"kubernetes.io/projected/30b6f764-0001-4a0d-b357-241415ad798d-kube-api-access-vlxq2\") pod \"maas-api-8c469dc49-gb5dm\" (UID: \"30b6f764-0001-4a0d-b357-241415ad798d\") " pod="opendatahub/maas-api-8c469dc49-gb5dm" Apr 16 23:59:55.807044 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.806995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/30b6f764-0001-4a0d-b357-241415ad798d-maas-api-tls\") pod \"maas-api-8c469dc49-gb5dm\" (UID: \"30b6f764-0001-4a0d-b357-241415ad798d\") " pod="opendatahub/maas-api-8c469dc49-gb5dm" Apr 16 23:59:55.811861 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.811833 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsn5d\" (UniqueName: \"kubernetes.io/projected/96924fc3-b273-4f2d-80ae-329a8e077194-kube-api-access-lsn5d\") pod \"maas-controller-7446c54ff4-5pq84\" (UID: \"96924fc3-b273-4f2d-80ae-329a8e077194\") " pod="opendatahub/maas-controller-7446c54ff4-5pq84" Apr 16 23:59:55.812242 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.812222 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-vlxq2\" (UniqueName: \"kubernetes.io/projected/30b6f764-0001-4a0d-b357-241415ad798d-kube-api-access-vlxq2\") pod \"maas-api-8c469dc49-gb5dm\" (UID: \"30b6f764-0001-4a0d-b357-241415ad798d\") " pod="opendatahub/maas-api-8c469dc49-gb5dm" Apr 16 23:59:55.844320 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.844296 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-8c469dc49-gb5dm" Apr 16 23:59:55.867232 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.867208 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7446c54ff4-5pq84" Apr 16 23:59:55.981437 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:59:55.981411 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30b6f764_0001_4a0d_b357_241415ad798d.slice/crio-69d49e14103955a3324bda3d1e5d81012945f6ee11007a63fd246456bca79ae6 WatchSource:0}: Error finding container 69d49e14103955a3324bda3d1e5d81012945f6ee11007a63fd246456bca79ae6: Status 404 returned error can't find the container with id 69d49e14103955a3324bda3d1e5d81012945f6ee11007a63fd246456bca79ae6 Apr 16 23:59:55.982030 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:55.982004 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-8c469dc49-gb5dm"] Apr 16 23:59:56.002611 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:56.002593 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7446c54ff4-5pq84"] Apr 16 23:59:56.004728 ip-10-0-133-231 kubenswrapper[2576]: W0416 23:59:56.004697 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96924fc3_b273_4f2d_80ae_329a8e077194.slice/crio-2f8a961af5e55ad0638cf9809077ad46e70d7532426dc30575f955c191800342 WatchSource:0}: Error finding container 
2f8a961af5e55ad0638cf9809077ad46e70d7532426dc30575f955c191800342: Status 404 returned error can't find the container with id 2f8a961af5e55ad0638cf9809077ad46e70d7532426dc30575f955c191800342 Apr 16 23:59:56.413455 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:56.413423 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-9b67c8845-hhnj6"] Apr 16 23:59:56.418341 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:56.418318 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-9b67c8845-hhnj6" Apr 16 23:59:56.428252 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:56.428229 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-9b67c8845-hhnj6"] Apr 16 23:59:56.610823 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:56.610787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqbmr\" (UniqueName: \"kubernetes.io/projected/262af08c-4532-4a8c-b1c5-5919af6414aa-kube-api-access-nqbmr\") pod \"maas-api-9b67c8845-hhnj6\" (UID: \"262af08c-4532-4a8c-b1c5-5919af6414aa\") " pod="opendatahub/maas-api-9b67c8845-hhnj6" Apr 16 23:59:56.611025 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:56.610973 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/262af08c-4532-4a8c-b1c5-5919af6414aa-maas-api-tls\") pod \"maas-api-9b67c8845-hhnj6\" (UID: \"262af08c-4532-4a8c-b1c5-5919af6414aa\") " pod="opendatahub/maas-api-9b67c8845-hhnj6" Apr 16 23:59:56.711681 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:56.711592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/262af08c-4532-4a8c-b1c5-5919af6414aa-maas-api-tls\") pod \"maas-api-9b67c8845-hhnj6\" (UID: \"262af08c-4532-4a8c-b1c5-5919af6414aa\") " pod="opendatahub/maas-api-9b67c8845-hhnj6" Apr 
16 23:59:56.711842 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:56.711793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqbmr\" (UniqueName: \"kubernetes.io/projected/262af08c-4532-4a8c-b1c5-5919af6414aa-kube-api-access-nqbmr\") pod \"maas-api-9b67c8845-hhnj6\" (UID: \"262af08c-4532-4a8c-b1c5-5919af6414aa\") " pod="opendatahub/maas-api-9b67c8845-hhnj6" Apr 16 23:59:56.714607 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:56.714573 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-8c469dc49-gb5dm" event={"ID":"30b6f764-0001-4a0d-b357-241415ad798d","Type":"ContainerStarted","Data":"69d49e14103955a3324bda3d1e5d81012945f6ee11007a63fd246456bca79ae6"} Apr 16 23:59:56.715001 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:56.714979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/262af08c-4532-4a8c-b1c5-5919af6414aa-maas-api-tls\") pod \"maas-api-9b67c8845-hhnj6\" (UID: \"262af08c-4532-4a8c-b1c5-5919af6414aa\") " pod="opendatahub/maas-api-9b67c8845-hhnj6" Apr 16 23:59:56.718526 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:56.718494 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7446c54ff4-5pq84" event={"ID":"96924fc3-b273-4f2d-80ae-329a8e077194","Type":"ContainerStarted","Data":"2f8a961af5e55ad0638cf9809077ad46e70d7532426dc30575f955c191800342"} Apr 16 23:59:56.721111 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:56.721087 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqbmr\" (UniqueName: \"kubernetes.io/projected/262af08c-4532-4a8c-b1c5-5919af6414aa-kube-api-access-nqbmr\") pod \"maas-api-9b67c8845-hhnj6\" (UID: \"262af08c-4532-4a8c-b1c5-5919af6414aa\") " pod="opendatahub/maas-api-9b67c8845-hhnj6" Apr 16 23:59:56.728665 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:56.728644 2576 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="opendatahub/maas-api-9b67c8845-hhnj6" Apr 16 23:59:56.907821 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:56.907553 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-9b67c8845-hhnj6"] Apr 16 23:59:57.724399 ip-10-0-133-231 kubenswrapper[2576]: I0416 23:59:57.724334 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-9b67c8845-hhnj6" event={"ID":"262af08c-4532-4a8c-b1c5-5919af6414aa","Type":"ContainerStarted","Data":"4d557d2825a4225c207666e0ce6333beb48070856cd01ef4ccfdc023b5c75048"} Apr 17 00:00:00.735937 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:00.735895 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-9b67c8845-hhnj6" event={"ID":"262af08c-4532-4a8c-b1c5-5919af6414aa","Type":"ContainerStarted","Data":"4571c6ec8f19fa08ea4d192d97b88a2f593d380aa176da13f34c3a81359a035b"} Apr 17 00:00:00.736381 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:00.736082 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-9b67c8845-hhnj6" Apr 17 00:00:00.737196 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:00.737167 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7446c54ff4-5pq84" event={"ID":"96924fc3-b273-4f2d-80ae-329a8e077194","Type":"ContainerStarted","Data":"c9cce34a2d6ec1e41905c8729cd38a538e4d3c6647d32935548990d166d881cf"} Apr 17 00:00:00.737313 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:00.737219 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7446c54ff4-5pq84" Apr 17 00:00:00.738263 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:00.738235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-8c469dc49-gb5dm" 
event={"ID":"30b6f764-0001-4a0d-b357-241415ad798d","Type":"ContainerStarted","Data":"3eca5e645d5bd75e8b1469fba7a20b1f0a6b9da35dc45ec39223e5f240b021db"} Apr 17 00:00:00.738414 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:00.738369 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-8c469dc49-gb5dm" Apr 17 00:00:00.749995 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:00.749948 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-9b67c8845-hhnj6" podStartSLOduration=1.9044026 podStartE2EDuration="4.749935315s" podCreationTimestamp="2026-04-16 23:59:56 +0000 UTC" firstStartedPulling="2026-04-16 23:59:56.917829588 +0000 UTC m=+581.463820167" lastFinishedPulling="2026-04-16 23:59:59.763362321 +0000 UTC m=+584.309352882" observedRunningTime="2026-04-17 00:00:00.748848294 +0000 UTC m=+585.294838875" watchObservedRunningTime="2026-04-17 00:00:00.749935315 +0000 UTC m=+585.295925891" Apr 17 00:00:00.762361 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:00.762323 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7446c54ff4-5pq84" podStartSLOduration=2.010861205 podStartE2EDuration="5.762312709s" podCreationTimestamp="2026-04-16 23:59:55 +0000 UTC" firstStartedPulling="2026-04-16 23:59:56.005952392 +0000 UTC m=+580.551942953" lastFinishedPulling="2026-04-16 23:59:59.757403883 +0000 UTC m=+584.303394457" observedRunningTime="2026-04-17 00:00:00.761529908 +0000 UTC m=+585.307520491" watchObservedRunningTime="2026-04-17 00:00:00.762312709 +0000 UTC m=+585.308303294" Apr 17 00:00:00.775460 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:00.775415 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-8c469dc49-gb5dm" podStartSLOduration=2.000340423 podStartE2EDuration="5.775405125s" podCreationTimestamp="2026-04-16 23:59:55 +0000 UTC" firstStartedPulling="2026-04-16 
23:59:55.982358615 +0000 UTC m=+580.528349176" lastFinishedPulling="2026-04-16 23:59:59.757423315 +0000 UTC m=+584.303413878" observedRunningTime="2026-04-17 00:00:00.773782404 +0000 UTC m=+585.319772976" watchObservedRunningTime="2026-04-17 00:00:00.775405125 +0000 UTC m=+585.321395706" Apr 17 00:00:06.747709 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:06.747679 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-9b67c8845-hhnj6" Apr 17 00:00:06.748160 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:06.748141 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-8c469dc49-gb5dm" Apr 17 00:00:06.799723 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:06.799678 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-8c469dc49-gb5dm"] Apr 17 00:00:06.799892 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:06.799869 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-8c469dc49-gb5dm" podUID="30b6f764-0001-4a0d-b357-241415ad798d" containerName="maas-api" containerID="cri-o://3eca5e645d5bd75e8b1469fba7a20b1f0a6b9da35dc45ec39223e5f240b021db" gracePeriod=30 Apr 17 00:00:07.044740 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.044716 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-8c469dc49-gb5dm"
Apr 17 00:00:07.078492 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.078459 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlxq2\" (UniqueName: \"kubernetes.io/projected/30b6f764-0001-4a0d-b357-241415ad798d-kube-api-access-vlxq2\") pod \"30b6f764-0001-4a0d-b357-241415ad798d\" (UID: \"30b6f764-0001-4a0d-b357-241415ad798d\") "
Apr 17 00:00:07.080739 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.080709 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b6f764-0001-4a0d-b357-241415ad798d-kube-api-access-vlxq2" (OuterVolumeSpecName: "kube-api-access-vlxq2") pod "30b6f764-0001-4a0d-b357-241415ad798d" (UID: "30b6f764-0001-4a0d-b357-241415ad798d"). InnerVolumeSpecName "kube-api-access-vlxq2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 00:00:07.179491 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.179464 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/30b6f764-0001-4a0d-b357-241415ad798d-maas-api-tls\") pod \"30b6f764-0001-4a0d-b357-241415ad798d\" (UID: \"30b6f764-0001-4a0d-b357-241415ad798d\") "
Apr 17 00:00:07.179644 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.179632 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vlxq2\" (UniqueName: \"kubernetes.io/projected/30b6f764-0001-4a0d-b357-241415ad798d-kube-api-access-vlxq2\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\""
Apr 17 00:00:07.181381 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.181361 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b6f764-0001-4a0d-b357-241415ad798d-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "30b6f764-0001-4a0d-b357-241415ad798d" (UID: "30b6f764-0001-4a0d-b357-241415ad798d"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 00:00:07.280403 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.280348 2576 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/30b6f764-0001-4a0d-b357-241415ad798d-maas-api-tls\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\""
Apr 17 00:00:07.760663 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.760623 2576 generic.go:358] "Generic (PLEG): container finished" podID="30b6f764-0001-4a0d-b357-241415ad798d" containerID="3eca5e645d5bd75e8b1469fba7a20b1f0a6b9da35dc45ec39223e5f240b021db" exitCode=0
Apr 17 00:00:07.761081 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.760692 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-8c469dc49-gb5dm"
Apr 17 00:00:07.761081 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.760712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-8c469dc49-gb5dm" event={"ID":"30b6f764-0001-4a0d-b357-241415ad798d","Type":"ContainerDied","Data":"3eca5e645d5bd75e8b1469fba7a20b1f0a6b9da35dc45ec39223e5f240b021db"}
Apr 17 00:00:07.761081 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.760757 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-8c469dc49-gb5dm" event={"ID":"30b6f764-0001-4a0d-b357-241415ad798d","Type":"ContainerDied","Data":"69d49e14103955a3324bda3d1e5d81012945f6ee11007a63fd246456bca79ae6"}
Apr 17 00:00:07.761081 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.760773 2576 scope.go:117] "RemoveContainer" containerID="3eca5e645d5bd75e8b1469fba7a20b1f0a6b9da35dc45ec39223e5f240b021db"
Apr 17 00:00:07.768804 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.768785 2576 scope.go:117] "RemoveContainer" containerID="3eca5e645d5bd75e8b1469fba7a20b1f0a6b9da35dc45ec39223e5f240b021db"
Apr 17 00:00:07.769108 ip-10-0-133-231 kubenswrapper[2576]: E0417 00:00:07.769091 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eca5e645d5bd75e8b1469fba7a20b1f0a6b9da35dc45ec39223e5f240b021db\": container with ID starting with 3eca5e645d5bd75e8b1469fba7a20b1f0a6b9da35dc45ec39223e5f240b021db not found: ID does not exist" containerID="3eca5e645d5bd75e8b1469fba7a20b1f0a6b9da35dc45ec39223e5f240b021db"
Apr 17 00:00:07.769171 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.769116 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eca5e645d5bd75e8b1469fba7a20b1f0a6b9da35dc45ec39223e5f240b021db"} err="failed to get container status \"3eca5e645d5bd75e8b1469fba7a20b1f0a6b9da35dc45ec39223e5f240b021db\": rpc error: code = NotFound desc = could not find container \"3eca5e645d5bd75e8b1469fba7a20b1f0a6b9da35dc45ec39223e5f240b021db\": container with ID starting with 3eca5e645d5bd75e8b1469fba7a20b1f0a6b9da35dc45ec39223e5f240b021db not found: ID does not exist"
Apr 17 00:00:07.778904 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.778877 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-8c469dc49-gb5dm"]
Apr 17 00:00:07.782614 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.782593 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-8c469dc49-gb5dm"]
Apr 17 00:00:07.981448 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:07.981409 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b6f764-0001-4a0d-b357-241415ad798d" path="/var/lib/kubelet/pods/30b6f764-0001-4a0d-b357-241415ad798d/volumes"
Apr 17 00:00:11.747821 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:11.747791 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7446c54ff4-5pq84"
Apr 17 00:00:12.027610 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:12.027523 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-f8d44945c-vkmdf"]
Apr 17 00:00:12.027890 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:12.027871 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30b6f764-0001-4a0d-b357-241415ad798d" containerName="maas-api"
Apr 17 00:00:12.028011 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:12.027893 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b6f764-0001-4a0d-b357-241415ad798d" containerName="maas-api"
Apr 17 00:00:12.028070 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:12.028012 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="30b6f764-0001-4a0d-b357-241415ad798d" containerName="maas-api"
Apr 17 00:00:12.032330 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:12.032307 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f8d44945c-vkmdf"
Apr 17 00:00:12.036731 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:12.036708 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f8d44945c-vkmdf"]
Apr 17 00:00:12.112082 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:12.112055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnmmv\" (UniqueName: \"kubernetes.io/projected/e1ba7f97-4062-44e0-961f-ed1407be1035-kube-api-access-xnmmv\") pod \"maas-controller-f8d44945c-vkmdf\" (UID: \"e1ba7f97-4062-44e0-961f-ed1407be1035\") " pod="opendatahub/maas-controller-f8d44945c-vkmdf"
Apr 17 00:00:12.212754 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:12.212717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnmmv\" (UniqueName: \"kubernetes.io/projected/e1ba7f97-4062-44e0-961f-ed1407be1035-kube-api-access-xnmmv\") pod \"maas-controller-f8d44945c-vkmdf\" (UID: \"e1ba7f97-4062-44e0-961f-ed1407be1035\") " pod="opendatahub/maas-controller-f8d44945c-vkmdf"
Apr 17 00:00:12.220401 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:12.220378 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnmmv\" (UniqueName: \"kubernetes.io/projected/e1ba7f97-4062-44e0-961f-ed1407be1035-kube-api-access-xnmmv\") pod \"maas-controller-f8d44945c-vkmdf\" (UID: \"e1ba7f97-4062-44e0-961f-ed1407be1035\") " pod="opendatahub/maas-controller-f8d44945c-vkmdf"
Apr 17 00:00:12.343366 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:12.343338 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f8d44945c-vkmdf"
Apr 17 00:00:12.465455 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:12.465423 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f8d44945c-vkmdf"]
Apr 17 00:00:12.468759 ip-10-0-133-231 kubenswrapper[2576]: W0417 00:00:12.468726 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1ba7f97_4062_44e0_961f_ed1407be1035.slice/crio-9fccbfc7a9ffaacfd32da54b766658dec2b3592efde767d0e5877bbbe87645a9 WatchSource:0}: Error finding container 9fccbfc7a9ffaacfd32da54b766658dec2b3592efde767d0e5877bbbe87645a9: Status 404 returned error can't find the container with id 9fccbfc7a9ffaacfd32da54b766658dec2b3592efde767d0e5877bbbe87645a9
Apr 17 00:00:12.777416 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:12.777382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f8d44945c-vkmdf" event={"ID":"e1ba7f97-4062-44e0-961f-ed1407be1035","Type":"ContainerStarted","Data":"9fccbfc7a9ffaacfd32da54b766658dec2b3592efde767d0e5877bbbe87645a9"}
Apr 17 00:00:13.781962 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:13.781903 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f8d44945c-vkmdf" event={"ID":"e1ba7f97-4062-44e0-961f-ed1407be1035","Type":"ContainerStarted","Data":"397415f7185d51b81beadd0f7d7fab37da8e1e3e54c8b13228296059eec41c92"}
Apr 17 00:00:13.782310 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:13.782029 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-f8d44945c-vkmdf"
Apr 17 00:00:13.799538 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:13.799487 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-f8d44945c-vkmdf" podStartSLOduration=1.501900727 podStartE2EDuration="1.799474181s" podCreationTimestamp="2026-04-17 00:00:12 +0000 UTC" firstStartedPulling="2026-04-17 00:00:12.470092738 +0000 UTC m=+597.016083301" lastFinishedPulling="2026-04-17 00:00:12.767666195 +0000 UTC m=+597.313656755" observedRunningTime="2026-04-17 00:00:13.798184473 +0000 UTC m=+598.344175054" watchObservedRunningTime="2026-04-17 00:00:13.799474181 +0000 UTC m=+598.345464764"
Apr 17 00:00:15.927179 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:15.927154 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log"
Apr 17 00:00:15.927179 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:15.927165 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log"
Apr 17 00:00:24.791436 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:24.791401 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-f8d44945c-vkmdf"
Apr 17 00:00:24.824383 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:24.824352 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7446c54ff4-5pq84"]
Apr 17 00:00:24.824585 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:24.824566 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7446c54ff4-5pq84" podUID="96924fc3-b273-4f2d-80ae-329a8e077194" containerName="manager" containerID="cri-o://c9cce34a2d6ec1e41905c8729cd38a538e4d3c6647d32935548990d166d881cf" gracePeriod=10
Apr 17 00:00:25.062607 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:25.062584 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7446c54ff4-5pq84"
Apr 17 00:00:25.092844 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:25.092814 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsn5d\" (UniqueName: \"kubernetes.io/projected/96924fc3-b273-4f2d-80ae-329a8e077194-kube-api-access-lsn5d\") pod \"96924fc3-b273-4f2d-80ae-329a8e077194\" (UID: \"96924fc3-b273-4f2d-80ae-329a8e077194\") "
Apr 17 00:00:25.094951 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:25.094888 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96924fc3-b273-4f2d-80ae-329a8e077194-kube-api-access-lsn5d" (OuterVolumeSpecName: "kube-api-access-lsn5d") pod "96924fc3-b273-4f2d-80ae-329a8e077194" (UID: "96924fc3-b273-4f2d-80ae-329a8e077194"). InnerVolumeSpecName "kube-api-access-lsn5d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 00:00:25.194056 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:25.194030 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lsn5d\" (UniqueName: \"kubernetes.io/projected/96924fc3-b273-4f2d-80ae-329a8e077194-kube-api-access-lsn5d\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\""
Apr 17 00:00:25.823481 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:25.823443 2576 generic.go:358] "Generic (PLEG): container finished" podID="96924fc3-b273-4f2d-80ae-329a8e077194" containerID="c9cce34a2d6ec1e41905c8729cd38a538e4d3c6647d32935548990d166d881cf" exitCode=0
Apr 17 00:00:25.823950 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:25.823500 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7446c54ff4-5pq84" event={"ID":"96924fc3-b273-4f2d-80ae-329a8e077194","Type":"ContainerDied","Data":"c9cce34a2d6ec1e41905c8729cd38a538e4d3c6647d32935548990d166d881cf"}
Apr 17 00:00:25.823950 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:25.823539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7446c54ff4-5pq84" event={"ID":"96924fc3-b273-4f2d-80ae-329a8e077194","Type":"ContainerDied","Data":"2f8a961af5e55ad0638cf9809077ad46e70d7532426dc30575f955c191800342"}
Apr 17 00:00:25.823950 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:25.823554 2576 scope.go:117] "RemoveContainer" containerID="c9cce34a2d6ec1e41905c8729cd38a538e4d3c6647d32935548990d166d881cf"
Apr 17 00:00:25.823950 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:25.823509 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7446c54ff4-5pq84"
Apr 17 00:00:25.834322 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:25.833891 2576 scope.go:117] "RemoveContainer" containerID="c9cce34a2d6ec1e41905c8729cd38a538e4d3c6647d32935548990d166d881cf"
Apr 17 00:00:25.834768 ip-10-0-133-231 kubenswrapper[2576]: E0417 00:00:25.834739 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9cce34a2d6ec1e41905c8729cd38a538e4d3c6647d32935548990d166d881cf\": container with ID starting with c9cce34a2d6ec1e41905c8729cd38a538e4d3c6647d32935548990d166d881cf not found: ID does not exist" containerID="c9cce34a2d6ec1e41905c8729cd38a538e4d3c6647d32935548990d166d881cf"
Apr 17 00:00:25.834871 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:25.834776 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9cce34a2d6ec1e41905c8729cd38a538e4d3c6647d32935548990d166d881cf"} err="failed to get container status \"c9cce34a2d6ec1e41905c8729cd38a538e4d3c6647d32935548990d166d881cf\": rpc error: code = NotFound desc = could not find container \"c9cce34a2d6ec1e41905c8729cd38a538e4d3c6647d32935548990d166d881cf\": container with ID starting with c9cce34a2d6ec1e41905c8729cd38a538e4d3c6647d32935548990d166d881cf not found: ID does not exist"
Apr 17 00:00:25.847043 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:25.847020 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7446c54ff4-5pq84"]
Apr 17 00:00:25.858646 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:25.858608 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7446c54ff4-5pq84"]
Apr 17 00:00:25.981298 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:25.981267 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96924fc3-b273-4f2d-80ae-329a8e077194" path="/var/lib/kubelet/pods/96924fc3-b273-4f2d-80ae-329a8e077194/volumes"
Apr 17 00:00:32.144427 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.144387 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"]
Apr 17 00:00:32.145015 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.144823 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96924fc3-b273-4f2d-80ae-329a8e077194" containerName="manager"
Apr 17 00:00:32.145015 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.144841 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="96924fc3-b273-4f2d-80ae-329a8e077194" containerName="manager"
Apr 17 00:00:32.145015 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.144954 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="96924fc3-b273-4f2d-80ae-329a8e077194" containerName="manager"
Apr 17 00:00:32.149438 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.149416 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.151170 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.151145 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-g4992\""
Apr 17 00:00:32.151404 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.151389 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 17 00:00:32.151718 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.151701 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 17 00:00:32.151788 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.151772 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\""
Apr 17 00:00:32.157709 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.157676 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"]
Apr 17 00:00:32.237458 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.237423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc6515a4-a118-409a-b833-1cf5128755dc-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.237728 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.237702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dc6515a4-a118-409a-b833-1cf5128755dc-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.237842 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.237749 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc6515a4-a118-409a-b833-1cf5128755dc-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.237842 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.237826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc6515a4-a118-409a-b833-1cf5128755dc-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.237974 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.237900 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgzvk\" (UniqueName: \"kubernetes.io/projected/dc6515a4-a118-409a-b833-1cf5128755dc-kube-api-access-dgzvk\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.237974 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.237957 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dc6515a4-a118-409a-b833-1cf5128755dc-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.338408 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.338371 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc6515a4-a118-409a-b833-1cf5128755dc-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.338551 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.338412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dc6515a4-a118-409a-b833-1cf5128755dc-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.338551 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.338440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc6515a4-a118-409a-b833-1cf5128755dc-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.338551 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.338482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc6515a4-a118-409a-b833-1cf5128755dc-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.338551 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.338508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgzvk\" (UniqueName: \"kubernetes.io/projected/dc6515a4-a118-409a-b833-1cf5128755dc-kube-api-access-dgzvk\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.338551 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.338534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dc6515a4-a118-409a-b833-1cf5128755dc-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.338971 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.338951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc6515a4-a118-409a-b833-1cf5128755dc-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.339053 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.338972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dc6515a4-a118-409a-b833-1cf5128755dc-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.339053 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.339003 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc6515a4-a118-409a-b833-1cf5128755dc-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.340766 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.340748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dc6515a4-a118-409a-b833-1cf5128755dc-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.341188 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.341169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc6515a4-a118-409a-b833-1cf5128755dc-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.345540 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.345516 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgzvk\" (UniqueName: \"kubernetes.io/projected/dc6515a4-a118-409a-b833-1cf5128755dc-kube-api-access-dgzvk\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st\" (UID: \"dc6515a4-a118-409a-b833-1cf5128755dc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.460243 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.460175 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:32.583400 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.583276 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"]
Apr 17 00:00:32.585983 ip-10-0-133-231 kubenswrapper[2576]: W0417 00:00:32.585953 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc6515a4_a118_409a_b833_1cf5128755dc.slice/crio-9bb15389dec3af88dfdc4fd0a17312c98896feaef50626448c11db21e8530cb0 WatchSource:0}: Error finding container 9bb15389dec3af88dfdc4fd0a17312c98896feaef50626448c11db21e8530cb0: Status 404 returned error can't find the container with id 9bb15389dec3af88dfdc4fd0a17312c98896feaef50626448c11db21e8530cb0
Apr 17 00:00:32.848979 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:32.848944 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st" event={"ID":"dc6515a4-a118-409a-b833-1cf5128755dc","Type":"ContainerStarted","Data":"9bb15389dec3af88dfdc4fd0a17312c98896feaef50626448c11db21e8530cb0"}
Apr 17 00:00:38.873675 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:38.873637 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st" event={"ID":"dc6515a4-a118-409a-b833-1cf5128755dc","Type":"ContainerStarted","Data":"7b1276827ec5c0b8b4c077224582f1d20c58f07a11e29cf556677d62748454ef"}
Apr 17 00:00:43.893296 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:43.893265 2576 generic.go:358] "Generic (PLEG): container finished" podID="dc6515a4-a118-409a-b833-1cf5128755dc" containerID="7b1276827ec5c0b8b4c077224582f1d20c58f07a11e29cf556677d62748454ef" exitCode=0
Apr 17 00:00:43.893680 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:43.893330 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st" event={"ID":"dc6515a4-a118-409a-b833-1cf5128755dc","Type":"ContainerDied","Data":"7b1276827ec5c0b8b4c077224582f1d20c58f07a11e29cf556677d62748454ef"}
Apr 17 00:00:43.893851 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:43.893836 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 00:00:47.910760 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:47.910724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st" event={"ID":"dc6515a4-a118-409a-b833-1cf5128755dc","Type":"ContainerStarted","Data":"0ffe6607fe97cf16e87c457ff86ccf1af27b1ac8d3a656d877c5a6e71e752300"}
Apr 17 00:00:47.911171 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:47.911075 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st"
Apr 17 00:00:47.927072 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:47.927025 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st" podStartSLOduration=0.798644854 podStartE2EDuration="15.927011013s" podCreationTimestamp="2026-04-17 00:00:32 +0000 UTC" firstStartedPulling="2026-04-17 00:00:32.587819238 +0000 UTC m=+617.133809798" lastFinishedPulling="2026-04-17 00:00:47.716185397 +0000 UTC m=+632.262175957" observedRunningTime="2026-04-17 00:00:47.925290554 +0000 UTC m=+632.471281136" watchObservedRunningTime="2026-04-17 00:00:47.927011013 +0000 UTC m=+632.473001595"
Apr 17 00:00:53.940642 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:53.940555 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"]
Apr 17 00:00:53.942771 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:53.942756 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:53.944654 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:53.944616 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\""
Apr 17 00:00:53.954376 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:53.954352 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"]
Apr 17 00:00:54.009719 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.009680 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.009880 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.009724 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.009880 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.009757 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zghvz\" (UniqueName: \"kubernetes.io/projected/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-kube-api-access-zghvz\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.009880 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.009784 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.009880 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.009822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.009880 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.009844 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.110541 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.110507 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.110541 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.110543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.110756 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.110562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zghvz\" (UniqueName: \"kubernetes.io/projected/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-kube-api-access-zghvz\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.110756 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.110581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.110756 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.110606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.110756 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.110646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.111082 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.111045 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.111173 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.111079 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.111229 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.111190 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.113279 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.113253 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.113389 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.113350 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.117097 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.117080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zghvz\" (UniqueName: \"kubernetes.io/projected/b56c139e-24d4-4986-a07b-a47cf0cc7b9a-kube-api-access-zghvz\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-qj47z\" (UID: \"b56c139e-24d4-4986-a07b-a47cf0cc7b9a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.255660 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.255594 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"
Apr 17 00:00:54.375175 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.375150 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z"]
Apr 17 00:00:54.377219 ip-10-0-133-231 kubenswrapper[2576]: W0417 00:00:54.377189 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb56c139e_24d4_4986_a07b_a47cf0cc7b9a.slice/crio-c1570794d1cdfcf5efa2dac61c391dfd53207542c9e8ba9693cdb753666c00e5 WatchSource:0}: Error finding container c1570794d1cdfcf5efa2dac61c391dfd53207542c9e8ba9693cdb753666c00e5: Status 404 returned error can't find the container with id c1570794d1cdfcf5efa2dac61c391dfd53207542c9e8ba9693cdb753666c00e5
Apr 17 00:00:54.934938 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.934871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z" event={"ID":"b56c139e-24d4-4986-a07b-a47cf0cc7b9a","Type":"ContainerStarted","Data":"9772e771c90b7b23be02ea7df1ddd7d122a1319916e65716e57e112916766340"}
Apr 17 00:00:54.934938 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:54.934942 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z" event={"ID":"b56c139e-24d4-4986-a07b-a47cf0cc7b9a","Type":"ContainerStarted","Data":"c1570794d1cdfcf5efa2dac61c391dfd53207542c9e8ba9693cdb753666c00e5"} Apr 17 00:00:56.635838 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.635805 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj"] Apr 17 00:00:56.638240 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.638219 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.640116 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.640090 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 17 00:00:56.648521 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.648503 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj"] Apr 17 00:00:56.733356 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.733322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1011a3e0-0c27-455a-b570-bb1353296a67-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.733526 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.733380 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1011a3e0-0c27-455a-b570-bb1353296a67-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.733526 ip-10-0-133-231 kubenswrapper[2576]: I0417 
00:00:56.733407 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhhf2\" (UniqueName: \"kubernetes.io/projected/1011a3e0-0c27-455a-b570-bb1353296a67-kube-api-access-xhhf2\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.733526 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.733424 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1011a3e0-0c27-455a-b570-bb1353296a67-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.733713 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.733519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1011a3e0-0c27-455a-b570-bb1353296a67-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.733713 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.733569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1011a3e0-0c27-455a-b570-bb1353296a67-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.834264 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.834231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/1011a3e0-0c27-455a-b570-bb1353296a67-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.834264 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.834268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1011a3e0-0c27-455a-b570-bb1353296a67-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.834521 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.834289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhhf2\" (UniqueName: \"kubernetes.io/projected/1011a3e0-0c27-455a-b570-bb1353296a67-kube-api-access-xhhf2\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.834521 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.834312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1011a3e0-0c27-455a-b570-bb1353296a67-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.834521 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.834355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1011a3e0-0c27-455a-b570-bb1353296a67-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.834521 ip-10-0-133-231 
kubenswrapper[2576]: I0417 00:00:56.834396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1011a3e0-0c27-455a-b570-bb1353296a67-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.834737 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.834625 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1011a3e0-0c27-455a-b570-bb1353296a67-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.834791 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.834736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1011a3e0-0c27-455a-b570-bb1353296a67-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.834791 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.834777 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1011a3e0-0c27-455a-b570-bb1353296a67-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.836741 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.836710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1011a3e0-0c27-455a-b570-bb1353296a67-dshm\") pod 
\"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.837120 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.837100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1011a3e0-0c27-455a-b570-bb1353296a67-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.841172 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.841150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhhf2\" (UniqueName: \"kubernetes.io/projected/1011a3e0-0c27-455a-b570-bb1353296a67-kube-api-access-xhhf2\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-zzvwj\" (UID: \"1011a3e0-0c27-455a-b570-bb1353296a67\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:56.951635 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:56.951555 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:00:57.087469 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:57.087442 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj"] Apr 17 00:00:57.088947 ip-10-0-133-231 kubenswrapper[2576]: W0417 00:00:57.088897 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1011a3e0_0c27_455a_b570_bb1353296a67.slice/crio-22c5d5de9ae10b0e5d850c5b6b80010907bf615d1e6479ef26c7bb4eb133ab2c WatchSource:0}: Error finding container 22c5d5de9ae10b0e5d850c5b6b80010907bf615d1e6479ef26c7bb4eb133ab2c: Status 404 returned error can't find the container with id 22c5d5de9ae10b0e5d850c5b6b80010907bf615d1e6479ef26c7bb4eb133ab2c Apr 17 00:00:57.947391 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:57.947343 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" event={"ID":"1011a3e0-0c27-455a-b570-bb1353296a67","Type":"ContainerStarted","Data":"66f8b395ca122da90d456d6501f9a09ef48a35ffafd6b29e5097a8eed8b849b5"} Apr 17 00:00:57.947841 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:57.947400 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" event={"ID":"1011a3e0-0c27-455a-b570-bb1353296a67","Type":"ContainerStarted","Data":"22c5d5de9ae10b0e5d850c5b6b80010907bf615d1e6479ef26c7bb4eb133ab2c"} Apr 17 00:00:58.927879 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:58.927844 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st" Apr 17 00:00:59.955738 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:59.955706 2576 generic.go:358] "Generic (PLEG): container finished" podID="b56c139e-24d4-4986-a07b-a47cf0cc7b9a" 
containerID="9772e771c90b7b23be02ea7df1ddd7d122a1319916e65716e57e112916766340" exitCode=0 Apr 17 00:00:59.956066 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:00:59.955797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z" event={"ID":"b56c139e-24d4-4986-a07b-a47cf0cc7b9a","Type":"ContainerDied","Data":"9772e771c90b7b23be02ea7df1ddd7d122a1319916e65716e57e112916766340"} Apr 17 00:01:00.960939 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:01:00.960873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z" event={"ID":"b56c139e-24d4-4986-a07b-a47cf0cc7b9a","Type":"ContainerStarted","Data":"e22132ad9342daa270c4e60ff45b9ef72d79fecf5657d5de119e2bb7d915e98d"} Apr 17 00:01:00.961434 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:01:00.961118 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z" Apr 17 00:01:00.976380 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:01:00.976338 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z" podStartSLOduration=7.81579183 podStartE2EDuration="7.976324863s" podCreationTimestamp="2026-04-17 00:00:53 +0000 UTC" firstStartedPulling="2026-04-17 00:00:59.956676466 +0000 UTC m=+644.502667040" lastFinishedPulling="2026-04-17 00:01:00.117209512 +0000 UTC m=+644.663200073" observedRunningTime="2026-04-17 00:01:00.975312318 +0000 UTC m=+645.521302896" watchObservedRunningTime="2026-04-17 00:01:00.976324863 +0000 UTC m=+645.522315445" Apr 17 00:01:02.969550 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:01:02.969517 2576 generic.go:358] "Generic (PLEG): container finished" podID="1011a3e0-0c27-455a-b570-bb1353296a67" containerID="66f8b395ca122da90d456d6501f9a09ef48a35ffafd6b29e5097a8eed8b849b5" exitCode=0 Apr 17 00:01:02.969901 ip-10-0-133-231 kubenswrapper[2576]: I0417 
00:01:02.969569 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" event={"ID":"1011a3e0-0c27-455a-b570-bb1353296a67","Type":"ContainerDied","Data":"66f8b395ca122da90d456d6501f9a09ef48a35ffafd6b29e5097a8eed8b849b5"} Apr 17 00:01:03.974251 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:01:03.974214 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" event={"ID":"1011a3e0-0c27-455a-b570-bb1353296a67","Type":"ContainerStarted","Data":"056cf810b94822a6f114d9940dcf93a2472d93d2f002e219faea330967bac5d9"} Apr 17 00:01:03.974718 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:01:03.974449 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:01:03.990369 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:01:03.990318 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" podStartSLOduration=7.810071144 podStartE2EDuration="7.990302461s" podCreationTimestamp="2026-04-17 00:00:56 +0000 UTC" firstStartedPulling="2026-04-17 00:01:02.970171933 +0000 UTC m=+647.516162492" lastFinishedPulling="2026-04-17 00:01:03.150403249 +0000 UTC m=+647.696393809" observedRunningTime="2026-04-17 00:01:03.989311057 +0000 UTC m=+648.535301639" watchObservedRunningTime="2026-04-17 00:01:03.990302461 +0000 UTC m=+648.536293044" Apr 17 00:01:11.981458 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:01:11.981432 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-qj47z" Apr 17 00:01:14.995284 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:01:14.995241 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-zzvwj" Apr 17 00:03:29.721421 ip-10-0-133-231 kubenswrapper[2576]: I0417 
00:03:29.721382 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f8d44945c-vkmdf"] Apr 17 00:03:29.721870 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:29.721624 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-f8d44945c-vkmdf" podUID="e1ba7f97-4062-44e0-961f-ed1407be1035" containerName="manager" containerID="cri-o://397415f7185d51b81beadd0f7d7fab37da8e1e3e54c8b13228296059eec41c92" gracePeriod=10 Apr 17 00:03:29.964308 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:29.964283 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f8d44945c-vkmdf" Apr 17 00:03:30.048473 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:30.048444 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnmmv\" (UniqueName: \"kubernetes.io/projected/e1ba7f97-4062-44e0-961f-ed1407be1035-kube-api-access-xnmmv\") pod \"e1ba7f97-4062-44e0-961f-ed1407be1035\" (UID: \"e1ba7f97-4062-44e0-961f-ed1407be1035\") " Apr 17 00:03:30.050673 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:30.050650 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ba7f97-4062-44e0-961f-ed1407be1035-kube-api-access-xnmmv" (OuterVolumeSpecName: "kube-api-access-xnmmv") pod "e1ba7f97-4062-44e0-961f-ed1407be1035" (UID: "e1ba7f97-4062-44e0-961f-ed1407be1035"). InnerVolumeSpecName "kube-api-access-xnmmv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 00:03:30.149652 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:30.149621 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnmmv\" (UniqueName: \"kubernetes.io/projected/e1ba7f97-4062-44e0-961f-ed1407be1035-kube-api-access-xnmmv\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 17 00:03:30.452868 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:30.452773 2576 generic.go:358] "Generic (PLEG): container finished" podID="e1ba7f97-4062-44e0-961f-ed1407be1035" containerID="397415f7185d51b81beadd0f7d7fab37da8e1e3e54c8b13228296059eec41c92" exitCode=0 Apr 17 00:03:30.453055 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:30.452863 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f8d44945c-vkmdf" event={"ID":"e1ba7f97-4062-44e0-961f-ed1407be1035","Type":"ContainerDied","Data":"397415f7185d51b81beadd0f7d7fab37da8e1e3e54c8b13228296059eec41c92"} Apr 17 00:03:30.453055 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:30.452875 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-f8d44945c-vkmdf" Apr 17 00:03:30.453055 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:30.452902 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f8d44945c-vkmdf" event={"ID":"e1ba7f97-4062-44e0-961f-ed1407be1035","Type":"ContainerDied","Data":"9fccbfc7a9ffaacfd32da54b766658dec2b3592efde767d0e5877bbbe87645a9"} Apr 17 00:03:30.453055 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:30.452934 2576 scope.go:117] "RemoveContainer" containerID="397415f7185d51b81beadd0f7d7fab37da8e1e3e54c8b13228296059eec41c92" Apr 17 00:03:30.461785 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:30.461764 2576 scope.go:117] "RemoveContainer" containerID="397415f7185d51b81beadd0f7d7fab37da8e1e3e54c8b13228296059eec41c92" Apr 17 00:03:30.462055 ip-10-0-133-231 kubenswrapper[2576]: E0417 00:03:30.462037 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"397415f7185d51b81beadd0f7d7fab37da8e1e3e54c8b13228296059eec41c92\": container with ID starting with 397415f7185d51b81beadd0f7d7fab37da8e1e3e54c8b13228296059eec41c92 not found: ID does not exist" containerID="397415f7185d51b81beadd0f7d7fab37da8e1e3e54c8b13228296059eec41c92" Apr 17 00:03:30.462123 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:30.462065 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"397415f7185d51b81beadd0f7d7fab37da8e1e3e54c8b13228296059eec41c92"} err="failed to get container status \"397415f7185d51b81beadd0f7d7fab37da8e1e3e54c8b13228296059eec41c92\": rpc error: code = NotFound desc = could not find container \"397415f7185d51b81beadd0f7d7fab37da8e1e3e54c8b13228296059eec41c92\": container with ID starting with 397415f7185d51b81beadd0f7d7fab37da8e1e3e54c8b13228296059eec41c92 not found: ID does not exist" Apr 17 00:03:30.473111 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:30.473081 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f8d44945c-vkmdf"] Apr 17 00:03:30.474967 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:30.474944 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-f8d44945c-vkmdf"] Apr 17 00:03:31.196800 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:31.196768 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-f8d44945c-6qhjp"] Apr 17 00:03:31.197170 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:31.197082 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1ba7f97-4062-44e0-961f-ed1407be1035" containerName="manager" Apr 17 00:03:31.197170 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:31.197093 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ba7f97-4062-44e0-961f-ed1407be1035" containerName="manager" Apr 17 00:03:31.197170 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:31.197156 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1ba7f97-4062-44e0-961f-ed1407be1035" containerName="manager" Apr 17 00:03:31.201096 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:31.201079 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-f8d44945c-6qhjp" Apr 17 00:03:31.203048 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:31.203028 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-f2s24\"" Apr 17 00:03:31.208524 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:31.208488 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f8d44945c-6qhjp"] Apr 17 00:03:31.359659 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:31.359627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66wms\" (UniqueName: \"kubernetes.io/projected/13bc901c-4d36-4fdf-aa96-f7e5dfc89082-kube-api-access-66wms\") pod \"maas-controller-f8d44945c-6qhjp\" (UID: \"13bc901c-4d36-4fdf-aa96-f7e5dfc89082\") " pod="opendatahub/maas-controller-f8d44945c-6qhjp" Apr 17 00:03:31.460211 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:31.460145 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66wms\" (UniqueName: \"kubernetes.io/projected/13bc901c-4d36-4fdf-aa96-f7e5dfc89082-kube-api-access-66wms\") pod \"maas-controller-f8d44945c-6qhjp\" (UID: \"13bc901c-4d36-4fdf-aa96-f7e5dfc89082\") " pod="opendatahub/maas-controller-f8d44945c-6qhjp" Apr 17 00:03:31.467143 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:31.467116 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66wms\" (UniqueName: \"kubernetes.io/projected/13bc901c-4d36-4fdf-aa96-f7e5dfc89082-kube-api-access-66wms\") pod \"maas-controller-f8d44945c-6qhjp\" (UID: \"13bc901c-4d36-4fdf-aa96-f7e5dfc89082\") " pod="opendatahub/maas-controller-f8d44945c-6qhjp" Apr 17 00:03:31.511727 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:31.511706 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-f8d44945c-6qhjp" Apr 17 00:03:31.635467 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:31.635423 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f8d44945c-6qhjp"] Apr 17 00:03:31.636250 ip-10-0-133-231 kubenswrapper[2576]: W0417 00:03:31.636222 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13bc901c_4d36_4fdf_aa96_f7e5dfc89082.slice/crio-19b9d76146ff1d01412d9477496c994e6c89bb88daddf6c31fe18671151e58dd WatchSource:0}: Error finding container 19b9d76146ff1d01412d9477496c994e6c89bb88daddf6c31fe18671151e58dd: Status 404 returned error can't find the container with id 19b9d76146ff1d01412d9477496c994e6c89bb88daddf6c31fe18671151e58dd Apr 17 00:03:31.981092 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:31.981057 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ba7f97-4062-44e0-961f-ed1407be1035" path="/var/lib/kubelet/pods/e1ba7f97-4062-44e0-961f-ed1407be1035/volumes" Apr 17 00:03:32.461805 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:32.461774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f8d44945c-6qhjp" event={"ID":"13bc901c-4d36-4fdf-aa96-f7e5dfc89082","Type":"ContainerStarted","Data":"fa3149b4cd8c01471e5ef0417fe9fa7e249a735674376b4c582aef2eadf1bdf5"} Apr 17 00:03:32.461805 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:32.461811 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f8d44945c-6qhjp" event={"ID":"13bc901c-4d36-4fdf-aa96-f7e5dfc89082","Type":"ContainerStarted","Data":"19b9d76146ff1d01412d9477496c994e6c89bb88daddf6c31fe18671151e58dd"} Apr 17 00:03:32.462245 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:32.461909 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-f8d44945c-6qhjp" Apr 17 
00:03:32.475980 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:32.475899 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-f8d44945c-6qhjp" podStartSLOduration=0.902071711 podStartE2EDuration="1.475883981s" podCreationTimestamp="2026-04-17 00:03:31 +0000 UTC" firstStartedPulling="2026-04-17 00:03:31.637613555 +0000 UTC m=+796.183604115" lastFinishedPulling="2026-04-17 00:03:32.211425826 +0000 UTC m=+796.757416385" observedRunningTime="2026-04-17 00:03:32.474992017 +0000 UTC m=+797.020982599" watchObservedRunningTime="2026-04-17 00:03:32.475883981 +0000 UTC m=+797.021874565" Apr 17 00:03:43.471506 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:03:43.471433 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-f8d44945c-6qhjp" Apr 17 00:05:15.953718 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:05:15.953640 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log" Apr 17 00:05:15.958675 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:05:15.958654 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log" Apr 17 00:10:15.975776 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:10:15.975746 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log" Apr 17 00:10:15.983016 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:10:15.982993 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log" Apr 17 00:15:15.997216 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:15:15.997190 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log" Apr 17 00:15:16.005450 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:15:16.005430 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log" Apr 17 00:20:16.017102 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:20:16.017076 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log" Apr 17 00:20:16.032572 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:20:16.032551 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log" Apr 17 00:24:31.427496 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:31.427468 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-vtshf_e1c923b6-f354-48cb-971f-893ba483dc99/manager/0.log" Apr 17 00:24:31.545937 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:31.545900 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-9b67c8845-hhnj6_262af08c-4532-4a8c-b1c5-5919af6414aa/maas-api/0.log" Apr 17 00:24:31.657069 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:31.657037 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-f8d44945c-6qhjp_13bc901c-4d36-4fdf-aa96-f7e5dfc89082/manager/0.log" Apr 17 00:24:31.766434 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:31.766369 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-gcnvw_d46e57f6-8d63-4f84-9e7a-e5e61281e599/manager/1.log" Apr 17 00:24:32.001545 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:32.001519 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-bf54d8685-9gm8q_2f2d0abb-045d-414c-a5d0-9ba3f291a473/manager/0.log" Apr 17 00:24:32.253726 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:32.253700 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-rdrwh_46fd5449-3f1d-4e95-9bd4-4d837df7c0ba/postgres/0.log" Apr 17 00:24:33.618375 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:33.618347 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-w2w4s_925fcd4a-e982-4f90-b9a1-99b50b7e6140/manager/0.log" Apr 17 00:24:34.734278 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:34.734213 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-kckq4_c588d5fa-35ce-4320-94e8-2b3c03dae725/discovery/0.log" Apr 17 00:24:34.944474 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:34.944446 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5cd78b4564-xxxhs_2686cb8c-f64c-447d-bd8e-f25b04de1e6b/kube-auth-proxy/0.log" Apr 17 00:24:35.499814 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:35.499787 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-qj47z_b56c139e-24d4-4986-a07b-a47cf0cc7b9a/storage-initializer/0.log" Apr 17 00:24:35.506556 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:35.506536 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-qj47z_b56c139e-24d4-4986-a07b-a47cf0cc7b9a/main/0.log" Apr 17 00:24:35.613504 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:35.613483 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-zzvwj_1011a3e0-0c27-455a-b570-bb1353296a67/storage-initializer/0.log" Apr 17 00:24:35.619703 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:35.619680 
2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-zzvwj_1011a3e0-0c27-455a-b570-bb1353296a67/main/0.log" Apr 17 00:24:35.849311 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:35.849277 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st_dc6515a4-a118-409a-b833-1cf5128755dc/main/0.log" Apr 17 00:24:35.855482 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:35.855463 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcczv9st_dc6515a4-a118-409a-b833-1cf5128755dc/storage-initializer/0.log" Apr 17 00:24:42.760680 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:42.760639 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-tnq5s_c517f342-1f92-4fea-88bf-76e1e2f71358/global-pull-secret-syncer/0.log" Apr 17 00:24:42.853111 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:42.853082 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-qcsbq_71c22abd-5fe1-440f-8a1c-d7fd92526d8f/konnectivity-agent/0.log" Apr 17 00:24:42.893651 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:42.893621 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-231.ec2.internal_1eb73e8eaa1b833503296b19a264c17c/haproxy/0.log" Apr 17 00:24:47.417087 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:47.417050 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-w2w4s_925fcd4a-e982-4f90-b9a1-99b50b7e6140/manager/0.log" Apr 17 00:24:49.664678 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:49.664645 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v8w79_c1bd9626-2f26-4387-95e3-983d628930db/node-exporter/0.log" Apr 17 00:24:49.688349 
ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:49.688322 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v8w79_c1bd9626-2f26-4387-95e3-983d628930db/kube-rbac-proxy/0.log" Apr 17 00:24:49.706814 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:49.706795 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v8w79_c1bd9626-2f26-4387-95e3-983d628930db/init-textfile/0.log" Apr 17 00:24:51.612430 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.612396 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt"] Apr 17 00:24:51.615579 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.615563 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.617639 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.617609 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vl859\"/\"openshift-service-ca.crt\"" Apr 17 00:24:51.617760 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.617701 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vl859\"/\"default-dockercfg-cqxln\"" Apr 17 00:24:51.618139 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.618125 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vl859\"/\"kube-root-ca.crt\"" Apr 17 00:24:51.624402 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.624379 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt"] Apr 17 00:24:51.683480 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.683450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/922e250f-56bd-4116-9b2b-ec8a2c11a89a-podres\") pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.683619 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.683500 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/922e250f-56bd-4116-9b2b-ec8a2c11a89a-sys\") pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.683619 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.683556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/922e250f-56bd-4116-9b2b-ec8a2c11a89a-proc\") pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.683619 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.683612 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knnzs\" (UniqueName: \"kubernetes.io/projected/922e250f-56bd-4116-9b2b-ec8a2c11a89a-kube-api-access-knnzs\") pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.683746 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.683651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/922e250f-56bd-4116-9b2b-ec8a2c11a89a-lib-modules\") pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " 
pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.784341 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.784316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/922e250f-56bd-4116-9b2b-ec8a2c11a89a-lib-modules\") pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.784454 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.784351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/922e250f-56bd-4116-9b2b-ec8a2c11a89a-podres\") pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.784454 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.784381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/922e250f-56bd-4116-9b2b-ec8a2c11a89a-sys\") pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.784526 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.784485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/922e250f-56bd-4116-9b2b-ec8a2c11a89a-podres\") pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.784526 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.784490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/922e250f-56bd-4116-9b2b-ec8a2c11a89a-lib-modules\") 
pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.784526 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.784513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/922e250f-56bd-4116-9b2b-ec8a2c11a89a-sys\") pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.784619 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.784543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/922e250f-56bd-4116-9b2b-ec8a2c11a89a-proc\") pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.784619 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.784598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knnzs\" (UniqueName: \"kubernetes.io/projected/922e250f-56bd-4116-9b2b-ec8a2c11a89a-kube-api-access-knnzs\") pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.784679 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.784633 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/922e250f-56bd-4116-9b2b-ec8a2c11a89a-proc\") pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.793135 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.793109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-knnzs\" (UniqueName: \"kubernetes.io/projected/922e250f-56bd-4116-9b2b-ec8a2c11a89a-kube-api-access-knnzs\") pod \"perf-node-gather-daemonset-j4nnt\" (UID: \"922e250f-56bd-4116-9b2b-ec8a2c11a89a\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:51.925648 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:51.925590 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:52.049078 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:52.049052 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt"] Apr 17 00:24:52.051329 ip-10-0-133-231 kubenswrapper[2576]: W0417 00:24:52.051297 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod922e250f_56bd_4116_9b2b_ec8a2c11a89a.slice/crio-65c96e0fd82b505b1a8bc0386a53effe270fdd03c704cd25db857f1ee98b4c79 WatchSource:0}: Error finding container 65c96e0fd82b505b1a8bc0386a53effe270fdd03c704cd25db857f1ee98b4c79: Status 404 returned error can't find the container with id 65c96e0fd82b505b1a8bc0386a53effe270fdd03c704cd25db857f1ee98b4c79 Apr 17 00:24:52.053010 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:52.052993 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 00:24:52.686015 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:52.685978 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" event={"ID":"922e250f-56bd-4116-9b2b-ec8a2c11a89a","Type":"ContainerStarted","Data":"c0ce2a17534a02f1106df73f5a93d408639d97ba89db802061abf5252b03c170"} Apr 17 00:24:52.686015 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:52.686013 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" 
event={"ID":"922e250f-56bd-4116-9b2b-ec8a2c11a89a","Type":"ContainerStarted","Data":"65c96e0fd82b505b1a8bc0386a53effe270fdd03c704cd25db857f1ee98b4c79"} Apr 17 00:24:52.686395 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:52.686075 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:24:52.701893 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:52.701850 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" podStartSLOduration=1.70183659 podStartE2EDuration="1.70183659s" podCreationTimestamp="2026-04-17 00:24:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 00:24:52.699371627 +0000 UTC m=+2077.245362212" watchObservedRunningTime="2026-04-17 00:24:52.70183659 +0000 UTC m=+2077.247827173" Apr 17 00:24:53.685575 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:53.685547 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fmhdc_7f0fc05f-8b6e-4d52-8962-dda3bf88746a/dns/0.log" Apr 17 00:24:53.703986 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:53.703963 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fmhdc_7f0fc05f-8b6e-4d52-8962-dda3bf88746a/kube-rbac-proxy/0.log" Apr 17 00:24:53.762462 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:53.762444 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-747kz_3d193d29-372b-44a9-a007-2f9fd389e08e/dns-node-resolver/0.log" Apr 17 00:24:54.297982 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:54.297952 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4ldzm_a6c8c77b-23d8-444b-93d9-efd10d5f4f5b/node-ca/0.log" Apr 17 00:24:55.220449 ip-10-0-133-231 kubenswrapper[2576]: I0417 
00:24:55.220415 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-kckq4_c588d5fa-35ce-4320-94e8-2b3c03dae725/discovery/0.log" Apr 17 00:24:55.262752 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:55.262724 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5cd78b4564-xxxhs_2686cb8c-f64c-447d-bd8e-f25b04de1e6b/kube-auth-proxy/0.log" Apr 17 00:24:55.887316 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:55.887290 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-k8zp4_bd90a043-dac9-4e2d-96a1-4eabae55281f/serve-healthcheck-canary/0.log" Apr 17 00:24:56.408972 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:56.408946 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9q77n_3fef4520-3fbc-4e8b-a599-35236b6977ca/kube-rbac-proxy/0.log" Apr 17 00:24:56.429823 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:56.429793 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9q77n_3fef4520-3fbc-4e8b-a599-35236b6977ca/exporter/0.log" Apr 17 00:24:56.449503 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:56.449473 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9q77n_3fef4520-3fbc-4e8b-a599-35236b6977ca/extractor/0.log" Apr 17 00:24:58.357366 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:58.357309 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-vtshf_e1c923b6-f354-48cb-971f-893ba483dc99/manager/0.log" Apr 17 00:24:58.388493 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:58.388460 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-9b67c8845-hhnj6_262af08c-4532-4a8c-b1c5-5919af6414aa/maas-api/0.log" Apr 17 00:24:58.439223 
ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:58.439198 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-f8d44945c-6qhjp_13bc901c-4d36-4fdf-aa96-f7e5dfc89082/manager/0.log" Apr 17 00:24:58.456290 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:58.456268 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-gcnvw_d46e57f6-8d63-4f84-9e7a-e5e61281e599/manager/0.log" Apr 17 00:24:58.468176 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:58.468150 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-gcnvw_d46e57f6-8d63-4f84-9e7a-e5e61281e599/manager/1.log" Apr 17 00:24:58.519135 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:58.519113 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-bf54d8685-9gm8q_2f2d0abb-045d-414c-a5d0-9ba3f291a473/manager/0.log" Apr 17 00:24:58.580713 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:58.580694 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-rdrwh_46fd5449-3f1d-4e95-9bd4-4d837df7c0ba/postgres/0.log" Apr 17 00:24:58.699595 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:24:58.699543 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-j4nnt" Apr 17 00:25:05.940148 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:05.940114 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x5gfp_4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc/kube-multus-additional-cni-plugins/0.log" Apr 17 00:25:05.958831 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:05.958809 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x5gfp_4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc/egress-router-binary-copy/0.log" Apr 17 
00:25:05.977158 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:05.977133 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x5gfp_4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc/cni-plugins/0.log" Apr 17 00:25:05.996146 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:05.996128 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x5gfp_4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc/bond-cni-plugin/0.log" Apr 17 00:25:06.014550 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:06.014528 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x5gfp_4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc/routeoverride-cni/0.log" Apr 17 00:25:06.035187 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:06.035163 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x5gfp_4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc/whereabouts-cni-bincopy/0.log" Apr 17 00:25:06.053150 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:06.053132 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x5gfp_4af2a1c3-1ef3-407b-b7ab-2cdd03b858cc/whereabouts-cni/0.log" Apr 17 00:25:06.079124 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:06.079104 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dtpng_63883006-764d-4455-b7c7-6289c17bdd27/kube-multus/0.log" Apr 17 00:25:06.207594 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:06.207531 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ktsl6_c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7/network-metrics-daemon/0.log" Apr 17 00:25:06.224217 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:06.224197 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-ktsl6_c74b0b0a-4b58-4ad4-9c2b-fcd27ad15de7/kube-rbac-proxy/0.log" Apr 17 00:25:07.017344 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:07.017319 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-controller/0.log" Apr 17 00:25:07.033196 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:07.033171 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/0.log" Apr 17 00:25:07.042711 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:07.042690 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovn-acl-logging/1.log" Apr 17 00:25:07.058639 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:07.058620 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/kube-rbac-proxy-node/0.log" Apr 17 00:25:07.076061 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:07.076041 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 00:25:07.090529 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:07.090509 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/northd/0.log" Apr 17 00:25:07.107476 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:07.107451 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/nbdb/0.log" Apr 17 00:25:07.127375 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:07.127329 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/sbdb/0.log" Apr 17 00:25:07.226604 ip-10-0-133-231 kubenswrapper[2576]: I0417 00:25:07.226578 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btfdz_f39171be-e0ce-40eb-86b2-8d51c766008b/ovnkube-controller/0.log"