Apr 20 20:11:23.999063 ip-10-0-129-247 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 20:11:24.521177 ip-10-0-129-247 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:11:24.521177 ip-10-0-129-247 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 20:11:24.521177 ip-10-0-129-247 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:11:24.521177 ip-10-0-129-247 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 20:11:24.521177 ip-10-0-129-247 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:11:24.521923 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.521240 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 20:11:24.529338 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529317 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:11:24.529338 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529333 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:11:24.529338 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529340 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:11:24.529338 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529344 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529347 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529351 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529354 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529356 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529359 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529361 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529364 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529366 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529369 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529372 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529375 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529377 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529380 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529383 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529385 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529388 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529390 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529393 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:11:24.529503 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529396 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529398 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529402 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529404 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529407 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529409 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529412 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529415 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529417 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529420 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529422 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529424 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529427 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529429 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529432 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529436 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529438 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529441 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529443 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529445 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:11:24.529966 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529448 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529451 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529453 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529460 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529463 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529465 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529468 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529472 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529474 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529477 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529479 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529482 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529485 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529487 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529490 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529492 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529495 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529497 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529500 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529503 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:11:24.530442 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529505 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529508 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529510 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529513 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529515 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529518 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529521 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529523 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529526 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529529 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529531 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529535 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529540 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529543 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529546 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529549 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529557 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529560 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529563 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529566 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:11:24.530920 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529568 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529571 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529574 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529576 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529972 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529977 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529980 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529983 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529986 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529989 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529991 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529994 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529997 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.529999 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530002 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530004 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530007 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530009 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530013 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530016 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:11:24.531399 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530018 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530021 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530023 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530026 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530028 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530031 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530034 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530036 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530039 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530042 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530044 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530047 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530049 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530052 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530054 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530058 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530062 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530065 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530068 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530070 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:11:24.531905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530073 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530076 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530078 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530081 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530084 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530086 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530089 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530092 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530094 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530097 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530100 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530102 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530105 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530107 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530110 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530112 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530115 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530117 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530119 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:11:24.532401 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530122 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530124 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530127 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530130 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530132 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530135 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530137 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530139 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530142 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530144 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530146 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530149 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530151 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530154 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530156 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530158 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530166 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530169 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530172 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:11:24.532886 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530175 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530178 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530180 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530183 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530186 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530189 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530191 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530194 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530196 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530200 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530203 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530205 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530285 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530295 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530305 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530311 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530319 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530322 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530327 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530332 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530335 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 20:11:24.533372 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530338 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530342 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530345 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530348 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530351 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530354 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530356 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530359 2576 flags.go:64] FLAG: --cloud-config=""
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530362 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530364 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530369 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530371 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530374 2576 flags.go:64] FLAG: --config-dir=""
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530377 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530380 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530389 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530392 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530395 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530399 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530401 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530404 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530407 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530410 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530413 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530417 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 20:11:24.533879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530420 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530423 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530426 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530429 2576 flags.go:64] FLAG: --enable-server="true"
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530432 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530437 2576 flags.go:64] FLAG: --event-burst="100"
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530440 2576 flags.go:64] FLAG: --event-qps="50"
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530442 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530446 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530449 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530453 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530456 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530459 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530462 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530465 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530468 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530470 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530473 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420
20:11:24.530476 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530479 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530481 2576 flags.go:64] FLAG: --feature-gates="" Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530485 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530490 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530493 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530497 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530500 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 20 20:11:24.534505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530503 2576 flags.go:64] FLAG: --help="false" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530505 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-129-247.ec2.internal" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530508 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530511 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530514 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530517 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 20:11:24.535208 
ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530520 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530522 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530525 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530528 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530532 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530535 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530538 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530541 2576 flags.go:64] FLAG: --kube-reserved="" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530544 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530546 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530549 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530552 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530555 2576 flags.go:64] FLAG: --lock-file="" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530557 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530560 2576 flags.go:64] FLAG: 
--log-flush-frequency="5s" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530563 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530568 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 20:11:24.535208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530571 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530573 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530576 2576 flags.go:64] FLAG: --logging-format="text" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530579 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530582 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530587 2576 flags.go:64] FLAG: --manifest-url="" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530590 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530594 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530597 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530601 2576 flags.go:64] FLAG: --max-pods="110" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530604 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530607 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 20:11:24.535759 ip-10-0-129-247 
kubenswrapper[2576]: I0420 20:11:24.530610 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530613 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530616 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530618 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530621 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530628 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530631 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530635 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530638 2576 flags.go:64] FLAG: --pod-cidr="" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530641 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530646 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530649 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 20:11:24.535759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530652 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530654 2576 flags.go:64] FLAG: --port="10250" Apr 20 
20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530657 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530660 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e1f21e22d616fe67" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530663 2576 flags.go:64] FLAG: --qos-reserved="" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530666 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530668 2576 flags.go:64] FLAG: --register-node="true" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530671 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530674 2576 flags.go:64] FLAG: --register-with-taints="" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530677 2576 flags.go:64] FLAG: --registry-burst="10" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530680 2576 flags.go:64] FLAG: --registry-qps="5" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530682 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530685 2576 flags.go:64] FLAG: --reserved-memory="" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530690 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530697 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530700 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530703 2576 flags.go:64] FLAG: 
--rotate-server-certificates="false" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530705 2576 flags.go:64] FLAG: --runonce="false" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530709 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530712 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530714 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530717 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530720 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530723 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530726 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530728 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 20:11:24.536371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530731 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530752 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530755 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530759 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530762 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 
20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530765 2576 flags.go:64] FLAG: --system-cgroups="" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530767 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530773 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530775 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530778 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530782 2576 flags.go:64] FLAG: --tls-min-version="" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530784 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530787 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530790 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530792 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530795 2576 flags.go:64] FLAG: --v="2" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530800 2576 flags.go:64] FLAG: --version="false" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530804 2576 flags.go:64] FLAG: --vmodule="" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530809 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.530816 2576 flags.go:64] FLAG: 
--volume-stats-agg-period="1m0s" Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530908 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530912 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530915 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530917 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:11:24.537032 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530921 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530924 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530927 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530929 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530932 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530935 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530937 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530940 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:11:24.537595 ip-10-0-129-247 
kubenswrapper[2576]: W0420 20:11:24.530942 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530945 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530948 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530951 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530954 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530956 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530959 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530961 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530964 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530966 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530969 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530971 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 20:11:24.537595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530973 2576 feature_gate.go:328] unrecognized feature gate: 
InsightsConfigAPI Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530976 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530978 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530981 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530983 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530985 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530987 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530992 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.530997 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531000 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531003 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531006 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531008 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531010 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531013 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531015 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531018 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531020 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531022 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531025 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:11:24.538138 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531027 2576 
feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531030 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531032 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531034 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531037 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531040 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531042 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531045 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531047 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531050 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531052 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531055 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531057 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:11:24.538634 
ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531059 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531062 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531064 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531066 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531069 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531072 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531074 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:11:24.538634 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531078 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531081 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531084 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531086 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531089 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531091 2576 feature_gate.go:328] unrecognized 
feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531094 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531096 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531100 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531103 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531106 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531109 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531111 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531114 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531117 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531119 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531122 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531124 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531127 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:11:24.539148 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531129 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531132 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.531135 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.531813 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.539032 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.539137 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539189 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539194 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539197 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539201 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539204 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539207 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539211 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539216 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539219 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:11:24.539618 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539222 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539225 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539228 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539231 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539233 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539236 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539238 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539241 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539244 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539246 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539249 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539251 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539254 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539257 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539259 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539262 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539265 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539268 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539270 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539273 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:11:24.540021 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539275 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539277 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539280 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539284 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539286 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539289 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539291 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539294 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539296 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539299 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539301 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539304 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539306 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539309 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539311 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539314 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539316 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539319 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539321 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539323 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:11:24.540489 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539326 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539328 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539331 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539333 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539336 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539339 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539342 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539344 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539346 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539350 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539352 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539354 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539357 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539359 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539362 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539364 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539367 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539369 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539372 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539374 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:11:24.540989 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539377 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539379 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539383 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539386 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539389 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539391 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539394 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539396 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539399 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539401 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539404 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539406 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539423 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539427 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539429 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539432 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:11:24.541506 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539435 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.539440 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539529 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539533 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539536 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539539 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539542 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539545 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539548 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539551 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539553 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539556 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539558 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539561 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539563 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539566 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:11:24.541907 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539568 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539571 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539573 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539576 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539578 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539581 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539583 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539586 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539588 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539591 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539593 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539596 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539598 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539600 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539603 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539605 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539607 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539610 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539613 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539616 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:11:24.542301 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539618 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539621 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539624 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539627 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539630 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539632 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539635 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539637 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539640 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539642 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539645 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539647 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539649 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539652 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539654 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539657 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539659 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539661 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539664 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:11:24.542917 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539668 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539680 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539683 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539686 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539690 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539692 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539695 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539697 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539700 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539702 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539705 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539708 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539710 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539713 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539715 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539718 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539720 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539723 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539726 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539728 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:11:24.543371 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539730 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539750 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539753 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539756 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539759 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539761 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539764 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539766 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539768 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539772 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539775 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539778 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:24.539780 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.539785 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.540845 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 20:11:24.543905 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.542824 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 20:11:24.544289 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.543829 2576 server.go:1019] "Starting client certificate rotation"
Apr 20 20:11:24.544289 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.543931 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:11:24.544289 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.543966 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:11:24.570438 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.570422 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:11:24.573427 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.573413 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:11:24.590723 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.590703 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 20 20:11:24.597593 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.597576 2576 log.go:25] "Validated CRI v1 image API"
Apr 20 20:11:24.598798 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.598782 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 20:11:24.602344 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.602325 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 20:11:24.605622 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.605599 2576 fs.go:135] Filesystem UUIDs: map[2fc7eca7-a985-458c-80e3-3d5de2d30da1:/dev/nvme0n1p3 4c79899f-24e1-4fbf-a725-21cd2c3afdd9:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 20 20:11:24.605698 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.605621 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 20:11:24.611383 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.611273 2576 manager.go:217] Machine: {Timestamp:2026-04-20 20:11:24.609214906 +0000 UTC m=+0.478993529 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3108384 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23c52c5d26894e2eaa1b9cf66e3a80 SystemUUID:ec23c52c-5d26-894e-2eaa-1b9cf66e3a80 BootID:3cdc1e7f-fd75-474d-9e0b-3bcee50a6bed Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:20:46:0a:62:d3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:20:46:0a:62:d3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f2:ee:ae:16:cb:69 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 20:11:24.611383 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.611370 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 20:11:24.611544 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.611505 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 20:11:24.611848 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.611825 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 20:11:24.612021 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.611848 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-247.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 20:11:24.612106 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.612034 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 20:11:24.612106 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.612047 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 20:11:24.612106 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.612070 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:11:24.613062 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.613051 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:11:24.614423 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.614411 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:11:24.614553 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.614542 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 20:11:24.617493 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.617482 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 20 20:11:24.617551 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.617500 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 20:11:24.617551 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.617517 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 20:11:24.617551 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.617530 2576 kubelet.go:397] "Adding apiserver pod source" Apr 20 20:11:24.617551 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.617546 2576 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 20 20:11:24.618750 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.618722 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:11:24.618826 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.618764 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:11:24.620972 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.620948 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rg7sf" Apr 20 20:11:24.624893 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.624876 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 20:11:24.626153 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.626137 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 20:11:24.627875 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.627861 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rg7sf" Apr 20 20:11:24.628645 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.628631 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 20:11:24.628723 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.628657 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 20:11:24.628723 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.628666 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 20:11:24.628723 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.628684 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 20:11:24.628723 
ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.628695 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 20:11:24.628723 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.628705 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 20:11:24.628723 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.628719 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 20:11:24.628975 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.628729 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 20:11:24.628975 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.628762 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 20:11:24.628975 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.628778 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 20:11:24.628975 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.628799 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 20:11:24.628975 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.628817 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 20:11:24.629997 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.629976 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 20:11:24.630291 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:24.630221 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 20:11:24.630376 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.630333 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 20:11:24.631018 ip-10-0-129-247 
kubenswrapper[2576]: E0420 20:11:24.630030 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-247.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 20:11:24.633164 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.633148 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-247.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 20:11:24.635330 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.635315 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 20:11:24.635370 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.635363 2576 server.go:1295] "Started kubelet" Apr 20 20:11:24.635460 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.635410 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 20:11:24.635554 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.635510 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 20:11:24.635614 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.635573 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 20:11:24.636189 ip-10-0-129-247 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 20:11:24.636705 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.636523 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 20 20:11:24.637241 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.637223 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 20:11:24.642839 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.642822 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 20:11:24.642839 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.642832 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 20:11:24.643564 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.643543 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 20:11:24.643564 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.643556 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 20:11:24.643719 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.643578 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 20:11:24.643719 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.643663 2576 factory.go:55] Registering systemd factory Apr 20 20:11:24.643719 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:24.643708 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-247.ec2.internal\" not found" Apr 20 20:11:24.643719 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.643716 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 20 20:11:24.643719 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.643732 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 20 20:11:24.643719 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.643760 2576 factory.go:223] Registration of the systemd container factory successfully Apr 20 20:11:24.644086 ip-10-0-129-247 kubenswrapper[2576]: I0420 
20:11:24.644073 2576 factory.go:153] Registering CRI-O factory Apr 20 20:11:24.644086 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.644088 2576 factory.go:223] Registration of the crio container factory successfully Apr 20 20:11:24.644148 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.644127 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 20:11:24.644148 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.644146 2576 factory.go:103] Registering Raw factory Apr 20 20:11:24.644208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.644156 2576 manager.go:1196] Started watching for new ooms in manager Apr 20 20:11:24.644460 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.644447 2576 manager.go:319] Starting recovery of all containers Apr 20 20:11:24.647395 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.647366 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:11:24.649557 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:24.649536 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-247.ec2.internal\" not found" node="ip-10-0-129-247.ec2.internal" Apr 20 20:11:24.657510 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.657493 2576 manager.go:324] Recovery completed Apr 20 20:11:24.661650 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.661638 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:11:24.664125 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.664108 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:11:24.664204 ip-10-0-129-247 
kubenswrapper[2576]: I0420 20:11:24.664137 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:11:24.664204 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.664148 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:11:24.664731 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.664522 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 20:11:24.664812 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.664766 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 20:11:24.664812 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.664793 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:11:24.667105 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.667090 2576 policy_none.go:49] "None policy: Start" Apr 20 20:11:24.667173 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.667115 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 20:11:24.667173 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.667126 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 20 20:11:24.702350 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.702336 2576 manager.go:341] "Starting Device Plugin manager" Apr 20 20:11:24.712538 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:24.702375 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 20:11:24.712538 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.702389 2576 server.go:85] "Starting device plugin registration server" Apr 20 20:11:24.712538 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.702601 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 20:11:24.712538 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.702612 2576 
container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 20:11:24.712538 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.702705 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 20:11:24.712538 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.702799 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 20:11:24.712538 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.702812 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 20:11:24.712538 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:24.703364 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 20:11:24.712538 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:24.703401 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-247.ec2.internal\" not found" Apr 20 20:11:24.778829 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.778782 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 20:11:24.780047 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.780027 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 20:11:24.780120 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.780056 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 20:11:24.780120 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.780073 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 20:11:24.780120 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.780080 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 20:11:24.780257 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:24.780136 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 20:11:24.782018 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.781999 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:11:24.803082 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.803064 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:11:24.803811 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.803795 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:11:24.803890 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.803823 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:11:24.803890 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.803836 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:11:24.803890 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.803860 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-247.ec2.internal" Apr 20 20:11:24.811628 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.811614 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-247.ec2.internal" Apr 20 20:11:24.811679 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:24.811633 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-247.ec2.internal\": node \"ip-10-0-129-247.ec2.internal\" not found" Apr 20 
20:11:24.822219 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:24.822201 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-247.ec2.internal\" not found" Apr 20 20:11:24.880514 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.880487 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-247.ec2.internal"] Apr 20 20:11:24.880583 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.880566 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:11:24.881313 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.881299 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:11:24.881390 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.881325 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:11:24.881390 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.881335 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:11:24.882337 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.882325 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:11:24.882490 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.882478 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal" Apr 20 20:11:24.882525 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.882507 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:11:24.882934 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.882920 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:11:24.883008 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.882946 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:11:24.883008 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.882958 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:11:24.883008 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.882973 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:11:24.883008 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.882989 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:11:24.883008 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.882998 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:11:24.883985 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.883971 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-247.ec2.internal" Apr 20 20:11:24.884035 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.883994 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:11:24.884619 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.884607 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:11:24.884689 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.884629 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:11:24.884689 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.884644 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:11:24.904244 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:24.904228 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-247.ec2.internal\" not found" node="ip-10-0-129-247.ec2.internal" Apr 20 20:11:24.907591 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:24.907577 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-247.ec2.internal\" not found" node="ip-10-0-129-247.ec2.internal" Apr 20 20:11:24.922446 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:24.922429 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-247.ec2.internal\" not found" Apr 20 20:11:24.944940 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.944919 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/53bc6936a6a8564836169d34a32da252-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal\" (UID: \"53bc6936a6a8564836169d34a32da252\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal" Apr 20 20:11:24.944995 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.944942 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53bc6936a6a8564836169d34a32da252-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal\" (UID: \"53bc6936a6a8564836169d34a32da252\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal" Apr 20 20:11:24.944995 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:24.944972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e535b6aadad4bcde1b5b507c970d0c4f-config\") pod \"kube-apiserver-proxy-ip-10-0-129-247.ec2.internal\" (UID: \"e535b6aadad4bcde1b5b507c970d0c4f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-247.ec2.internal" Apr 20 20:11:25.022947 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:25.022928 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-247.ec2.internal\" not found" Apr 20 20:11:25.045252 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.045234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/53bc6936a6a8564836169d34a32da252-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal\" (UID: \"53bc6936a6a8564836169d34a32da252\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal" Apr 20 20:11:25.045314 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.045269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/53bc6936a6a8564836169d34a32da252-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal\" (UID: \"53bc6936a6a8564836169d34a32da252\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal" Apr 20 20:11:25.045314 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.045303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53bc6936a6a8564836169d34a32da252-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal\" (UID: \"53bc6936a6a8564836169d34a32da252\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal" Apr 20 20:11:25.045389 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.045320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e535b6aadad4bcde1b5b507c970d0c4f-config\") pod \"kube-apiserver-proxy-ip-10-0-129-247.ec2.internal\" (UID: \"e535b6aadad4bcde1b5b507c970d0c4f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-247.ec2.internal" Apr 20 20:11:25.045389 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.045347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e535b6aadad4bcde1b5b507c970d0c4f-config\") pod \"kube-apiserver-proxy-ip-10-0-129-247.ec2.internal\" (UID: \"e535b6aadad4bcde1b5b507c970d0c4f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-247.ec2.internal" Apr 20 20:11:25.045389 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.045372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53bc6936a6a8564836169d34a32da252-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal\" (UID: \"53bc6936a6a8564836169d34a32da252\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal" Apr 20 20:11:25.123468 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:25.123447 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-247.ec2.internal\" not found" Apr 20 20:11:25.206927 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.206910 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal" Apr 20 20:11:25.210523 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.210504 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-247.ec2.internal" Apr 20 20:11:25.224105 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:25.224087 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-247.ec2.internal\" not found" Apr 20 20:11:25.324655 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:25.324602 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-247.ec2.internal\" not found" Apr 20 20:11:25.425100 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:25.425078 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-247.ec2.internal\" not found" Apr 20 20:11:25.444781 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.444764 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:11:25.544031 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.544012 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal" Apr 20 20:11:25.544031 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.544024 2576 transport.go:147] "Certificate rotation detected, shutting down 
client connections to start using new credentials" Apr 20 20:11:25.544635 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.544155 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 20:11:25.544635 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.544188 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 20:11:25.544635 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.544188 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 20:11:25.567109 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.567092 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 20:11:25.569405 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.569394 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-247.ec2.internal" Apr 20 20:11:25.577456 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.577424 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 20:11:25.618590 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.618573 2576 apiserver.go:52] "Watching apiserver" Apr 20 
20:11:25.625772 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.625757 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 20:11:25.626809 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.626788 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz","openshift-cluster-node-tuning-operator/tuned-bd5bk","openshift-dns/node-resolver-rwqjv","openshift-multus/network-metrics-daemon-c89q8","kube-system/konnectivity-agent-lptm4","openshift-image-registry/node-ca-hgbbc","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal","openshift-multus/multus-additional-cni-plugins-6lhxt","openshift-multus/multus-nbfv2","openshift-network-diagnostics/network-check-target-qcvnc","openshift-network-operator/iptables-alerter-4vqkf","openshift-ovn-kubernetes/ovnkube-node-jq6pm","kube-system/kube-apiserver-proxy-ip-10-0-129-247.ec2.internal"] Apr 20 20:11:25.629303 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.629283 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc" Apr 20 20:11:25.629385 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:25.629365 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7" Apr 20 20:11:25.630388 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.630374 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.630494 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.630479 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rwqjv" Apr 20 20:11:25.632586 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.632560 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 20:06:24 +0000 UTC" deadline="2027-10-13 16:59:04.379758625 +0000 UTC" Apr 20 20:11:25.632586 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.632582 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12980h47m38.747178672s" Apr 20 20:11:25.632784 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.632664 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:11:25.632784 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.632753 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 20:11:25.632929 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.632808 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 20:11:25.633026 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.633005 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:11:25.633136 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:25.633069 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2" Apr 20 20:11:25.634117 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.633247 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lptm4" Apr 20 20:11:25.634117 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.633498 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-k2xsp\"" Apr 20 20:11:25.634117 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.633496 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qr2n8\"" Apr 20 20:11:25.635930 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.634605 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 20:11:25.635930 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.634971 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hgbbc" Apr 20 20:11:25.636405 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.636386 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 20:11:25.636405 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.636391 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 20:11:25.636577 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.636469 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-sbtll\"" Apr 20 20:11:25.637075 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.637059 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 20:11:25.637453 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.637432 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 20:11:25.637453 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.637444 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 20:11:25.637621 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.637519 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:25.637621 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.637609 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.637757 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.637721 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-x9bmk\"" Apr 20 20:11:25.639561 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.639547 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 20:11:25.639699 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.639664 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 20:11:25.639868 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.639852 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 20:11:25.639932 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.639881 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 20:11:25.639982 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.639934 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-n495z\"" Apr 20 20:11:25.640028 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.640006 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 20:11:25.640177 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.640151 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:25.640364 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.640345 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 20:11:25.640455 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.640401 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-c9c95\"" Apr 20 20:11:25.641133 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.641117 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4vqkf" Apr 20 20:11:25.642063 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.642047 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xg57g\"" Apr 20 20:11:25.642204 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.642189 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.642499 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.642425 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 20:11:25.642499 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.642449 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 20:11:25.642499 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.642473 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 20:11:25.642904 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.642889 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 20:11:25.642981 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.642918 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:11:25.643211 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.643194 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sn7f8\"" Apr 20 20:11:25.643508 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.643495 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 20:11:25.643508 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.643504 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 20:11:25.644221 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.644204 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 20:11:25.644627 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.644611 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 20:11:25.644696 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.644643 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 20:11:25.644696 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.644649 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 20:11:25.644807 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.644751 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 20:11:25.644882 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.644868 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2gbkm\"" Apr 20 20:11:25.644928 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.644886 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 20:11:25.644928 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.644909 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 20:11:25.647778 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.647724 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbtvj\" (UniqueName: \"kubernetes.io/projected/f285fa38-b5da-431e-b1a4-e25936eb49d9-kube-api-access-zbtvj\") pod \"iptables-alerter-4vqkf\" (UID: \"f285fa38-b5da-431e-b1a4-e25936eb49d9\") " 
pod="openshift-network-operator/iptables-alerter-4vqkf" Apr 20 20:11:25.647851 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.647773 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-kubelet\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.647851 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.647816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-run-openvswitch\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.647851 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.647841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-run-ovn-kubernetes\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.647944 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.647866 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9aaf97f-201c-4eed-b64e-a78004674964-system-cni-dir\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:25.647944 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.647892 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a9aaf97f-201c-4eed-b64e-a78004674964-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:25.647944 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.647913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-run-netns\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.647944 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.647935 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f285fa38-b5da-431e-b1a4-e25936eb49d9-iptables-alerter-script\") pod \"iptables-alerter-4vqkf\" (UID: \"f285fa38-b5da-431e-b1a4-e25936eb49d9\") " pod="openshift-network-operator/iptables-alerter-4vqkf" Apr 20 20:11:25.648056 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.647956 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f856405-928f-4e0f-a6a4-b56a19061640-host\") pod \"node-ca-hgbbc\" (UID: \"2f856405-928f-4e0f-a6a4-b56a19061640\") " pod="openshift-image-registry/node-ca-hgbbc" Apr 20 20:11:25.648056 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.647988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-system-cni-dir\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.648056 ip-10-0-129-247 
kubenswrapper[2576]: I0420 20:11:25.648012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-var-lib-cni-bin\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.648056 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648034 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-hostroot\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.648165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648071 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-multus-conf-dir\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.648165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-run-multus-certs\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.648165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-etc-kubernetes\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 
20:11:25.648165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-etc-openvswitch\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-systemd\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648204 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-etc-selinux\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648231 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9aaf97f-201c-4eed-b64e-a78004674964-cni-binary-copy\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648255 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-multus-cni-dir\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648318 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-sys-fs\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f165a6a5-16a4-48c1-8e9a-9819f7939466-hosts-file\") pod \"node-resolver-rwqjv\" (UID: \"f165a6a5-16a4-48c1-8e9a-9819f7939466\") " pod="openshift-dns/node-resolver-rwqjv" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9aaf97f-201c-4eed-b64e-a78004674964-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 
20:11:25.648453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smvlp\" (UniqueName: \"kubernetes.io/projected/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-kube-api-access-smvlp\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-sysctl-d\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648514 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-sys\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648539 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ff1077e-9f9b-4382-9161-a7e4d8da5193-cni-binary-copy\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648564 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7ff1077e-9f9b-4382-9161-a7e4d8da5193-multus-daemon-config\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " 
pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648588 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-node-log\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.648634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648653 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-ovnkube-config\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-env-overrides\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/ff5c2154-884b-4487-9566-aadfee1d17e4-agent-certs\") pod \"konnectivity-agent-lptm4\" (UID: \"ff5c2154-884b-4487-9566-aadfee1d17e4\") " pod="kube-system/konnectivity-agent-lptm4" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ff5c2154-884b-4487-9566-aadfee1d17e4-konnectivity-ca\") pod \"konnectivity-agent-lptm4\" (UID: \"ff5c2154-884b-4487-9566-aadfee1d17e4\") " pod="kube-system/konnectivity-agent-lptm4" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f285fa38-b5da-431e-b1a4-e25936eb49d9-host-slash\") pod \"iptables-alerter-4vqkf\" (UID: \"f285fa38-b5da-431e-b1a4-e25936eb49d9\") " pod="openshift-network-operator/iptables-alerter-4vqkf" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648854 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-slash\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-run-netns\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648906 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-var-lib-openvswitch\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-run-k8s-cni-cncf-io\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-registration-dir\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.648989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-log-socket\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649043 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-cni-bin\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9aaf97f-201c-4eed-b64e-a78004674964-os-release\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-var-lib-kubelet\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649121 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lb4g\" (UniqueName: \"kubernetes.io/projected/7ff1077e-9f9b-4382-9161-a7e4d8da5193-kube-api-access-7lb4g\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.649273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649146 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jdkr\" (UniqueName: \"kubernetes.io/projected/2f856405-928f-4e0f-a6a4-b56a19061640-kube-api-access-6jdkr\") pod \"node-ca-hgbbc\" (UID: \"2f856405-928f-4e0f-a6a4-b56a19061640\") " pod="openshift-image-registry/node-ca-hgbbc" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-device-dir\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-host\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-multus-socket-dir-parent\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649241 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-lib-modules\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/36e9f673-1266-48a1-8251-c342b5b20ca7-tmp\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649285 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a9aaf97f-201c-4eed-b64e-a78004674964-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649302 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-cnibin\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649317 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-var-lib-cni-multus\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdbl\" (UniqueName: \"kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl\") pod \"network-check-target-qcvnc\" (UID: \"1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7\") " pod="openshift-network-diagnostics/network-check-target-qcvnc" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-sysconfig\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9aaf97f-201c-4eed-b64e-a78004674964-cnibin\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649414 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-os-release\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-socket-dir\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7mrr\" (UniqueName: \"kubernetes.io/projected/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-kube-api-access-k7mrr\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649471 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2d54\" 
(UniqueName: \"kubernetes.io/projected/36e9f673-1266-48a1-8251-c342b5b20ca7-kube-api-access-h2d54\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.649842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmfth\" (UniqueName: \"kubernetes.io/projected/a9aaf97f-201c-4eed-b64e-a78004674964-kube-api-access-mmfth\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649508 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-systemd-units\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-run-systemd\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-run-ovn\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 
20:11:25.649566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-sysctl-conf\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-run\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649595 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f165a6a5-16a4-48c1-8e9a-9819f7939466-tmp-dir\") pod \"node-resolver-rwqjv\" (UID: \"f165a6a5-16a4-48c1-8e9a-9819f7939466\") " pod="openshift-dns/node-resolver-rwqjv" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649617 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-ovn-node-metrics-cert\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-kubernetes\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649645 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-var-lib-kubelet\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649659 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkrwr\" (UniqueName: \"kubernetes.io/projected/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-kube-api-access-fkrwr\") pod \"network-metrics-daemon-c89q8\" (UID: \"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2\") " pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649679 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-cni-netd\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649706 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-ovnkube-script-lib\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-modprobe-d\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-tuned\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f856405-928f-4e0f-a6a4-b56a19061640-serviceca\") pod \"node-ca-hgbbc\" (UID: \"2f856405-928f-4e0f-a6a4-b56a19061640\") " pod="openshift-image-registry/node-ca-hgbbc" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h49dl\" (UniqueName: \"kubernetes.io/projected/f165a6a5-16a4-48c1-8e9a-9819f7939466-kube-api-access-h49dl\") pod \"node-resolver-rwqjv\" (UID: \"f165a6a5-16a4-48c1-8e9a-9819f7939466\") " pod="openshift-dns/node-resolver-rwqjv" Apr 20 20:11:25.650308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.649817 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs\") pod \"network-metrics-daemon-c89q8\" (UID: \"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2\") " pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:11:25.653568 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.653553 2576 reflector.go:430] "Caches 
populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 20:11:25.675116 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.675100 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9n5h4" Apr 20 20:11:25.682655 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.682636 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9n5h4" Apr 20 20:11:25.750450 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-lib-modules\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.750645 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/36e9f673-1266-48a1-8251-c342b5b20ca7-tmp\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.750645 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a9aaf97f-201c-4eed-b64e-a78004674964-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:25.750645 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-lib-modules\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.750645 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-cnibin\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.750891 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-var-lib-cni-multus\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.750891 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdbl\" (UniqueName: \"kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl\") pod \"network-check-target-qcvnc\" (UID: \"1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7\") " pod="openshift-network-diagnostics/network-check-target-qcvnc" Apr 20 20:11:25.750891 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750721 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-cnibin\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.750891 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-sysconfig\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.750891 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750721 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-var-lib-cni-multus\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.750891 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750787 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9aaf97f-201c-4eed-b64e-a78004674964-cnibin\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:25.750891 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750767 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 20:11:25.750891 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-os-release\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.750891 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-socket-dir\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:25.750891 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7mrr\" (UniqueName: \"kubernetes.io/projected/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-kube-api-access-k7mrr\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:25.750891 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750863 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9aaf97f-201c-4eed-b64e-a78004674964-cnibin\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:25.750891 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2d54\" (UniqueName: 
\"kubernetes.io/projected/36e9f673-1266-48a1-8251-c342b5b20ca7-kube-api-access-h2d54\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750912 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-os-release\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmfth\" (UniqueName: \"kubernetes.io/projected/a9aaf97f-201c-4eed-b64e-a78004674964-kube-api-access-mmfth\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-systemd-units\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750975 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-socket-dir\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.750993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-run-systemd\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-run-ovn\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-systemd-units\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a9aaf97f-201c-4eed-b64e-a78004674964-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751053 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-sysctl-conf\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751059 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-run-systemd\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751092 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-run\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751120 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f165a6a5-16a4-48c1-8e9a-9819f7939466-tmp-dir\") pod \"node-resolver-rwqjv\" (UID: \"f165a6a5-16a4-48c1-8e9a-9819f7939466\") " pod="openshift-dns/node-resolver-rwqjv" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-ovn-node-metrics-cert\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751156 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-sysctl-conf\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751178 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-run\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-kubernetes\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-var-lib-kubelet\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.751433 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-kubernetes\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751246 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-sysconfig\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkrwr\" 
(UniqueName: \"kubernetes.io/projected/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-kube-api-access-fkrwr\") pod \"network-metrics-daemon-c89q8\" (UID: \"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2\") " pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-var-lib-kubelet\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-cni-netd\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751432 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-ovnkube-script-lib\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f165a6a5-16a4-48c1-8e9a-9819f7939466-tmp-dir\") pod \"node-resolver-rwqjv\" (UID: \"f165a6a5-16a4-48c1-8e9a-9819f7939466\") " pod="openshift-dns/node-resolver-rwqjv"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751459 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-cni-netd\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-modprobe-d\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751553 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-tuned\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751552 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-run-ovn\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f856405-928f-4e0f-a6a4-b56a19061640-serviceca\") pod \"node-ca-hgbbc\" (UID: \"2f856405-928f-4e0f-a6a4-b56a19061640\") " pod="openshift-image-registry/node-ca-hgbbc"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h49dl\" (UniqueName: \"kubernetes.io/projected/f165a6a5-16a4-48c1-8e9a-9819f7939466-kube-api-access-h49dl\") pod \"node-resolver-rwqjv\" (UID: \"f165a6a5-16a4-48c1-8e9a-9819f7939466\") " pod="openshift-dns/node-resolver-rwqjv"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs\") pod \"network-metrics-daemon-c89q8\" (UID: \"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2\") " pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751651 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbtvj\" (UniqueName: \"kubernetes.io/projected/f285fa38-b5da-431e-b1a4-e25936eb49d9-kube-api-access-zbtvj\") pod \"iptables-alerter-4vqkf\" (UID: \"f285fa38-b5da-431e-b1a4-e25936eb49d9\") " pod="openshift-network-operator/iptables-alerter-4vqkf"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-modprobe-d\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-kubelet\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-kubelet\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.752336 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:25.751822 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751838 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-run-openvswitch\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-run-ovn-kubernetes\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:25.751893 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs podName:62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:26.251872305 +0000 UTC m=+2.121650931 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs") pod "network-metrics-daemon-c89q8" (UID: "62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751900 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-run-openvswitch\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9aaf97f-201c-4eed-b64e-a78004674964-system-cni-dir\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751930 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-run-ovn-kubernetes\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a9aaf97f-201c-4eed-b64e-a78004674964-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-run-netns\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.751997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f856405-928f-4e0f-a6a4-b56a19061640-serviceca\") pod \"node-ca-hgbbc\" (UID: \"2f856405-928f-4e0f-a6a4-b56a19061640\") " pod="openshift-image-registry/node-ca-hgbbc"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f285fa38-b5da-431e-b1a4-e25936eb49d9-iptables-alerter-script\") pod \"iptables-alerter-4vqkf\" (UID: \"f285fa38-b5da-431e-b1a4-e25936eb49d9\") " pod="openshift-network-operator/iptables-alerter-4vqkf"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752036 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f856405-928f-4e0f-a6a4-b56a19061640-host\") pod \"node-ca-hgbbc\" (UID: \"2f856405-928f-4e0f-a6a4-b56a19061640\") " pod="openshift-image-registry/node-ca-hgbbc"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-system-cni-dir\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-run-netns\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-var-lib-cni-bin\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-hostroot\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f856405-928f-4e0f-a6a4-b56a19061640-host\") pod \"node-ca-hgbbc\" (UID: \"2f856405-928f-4e0f-a6a4-b56a19061640\") " pod="openshift-image-registry/node-ca-hgbbc"
Apr 20 20:11:25.753165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9aaf97f-201c-4eed-b64e-a78004674964-system-cni-dir\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-multus-conf-dir\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752171 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-run-multus-certs\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752177 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-var-lib-cni-bin\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752196 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-etc-kubernetes\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-etc-openvswitch\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752230 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-system-cni-dir\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-systemd\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-etc-selinux\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752271 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-run-multus-certs\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9aaf97f-201c-4eed-b64e-a78004674964-cni-binary-copy\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752303 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-hostroot\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-multus-cni-dir\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-multus-conf-dir\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-sys-fs\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f165a6a5-16a4-48c1-8e9a-9819f7939466-hosts-file\") pod \"node-resolver-rwqjv\" (UID: \"f165a6a5-16a4-48c1-8e9a-9819f7939466\") " pod="openshift-dns/node-resolver-rwqjv"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-etc-kubernetes\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.753978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752447 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9aaf97f-201c-4eed-b64e-a78004674964-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752462 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f165a6a5-16a4-48c1-8e9a-9819f7939466-hosts-file\") pod \"node-resolver-rwqjv\" (UID: \"f165a6a5-16a4-48c1-8e9a-9819f7939466\") " pod="openshift-dns/node-resolver-rwqjv"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smvlp\" (UniqueName: \"kubernetes.io/projected/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-kube-api-access-smvlp\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-sysctl-d\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a9aaf97f-201c-4eed-b64e-a78004674964-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-sys\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ff1077e-9f9b-4382-9161-a7e4d8da5193-cni-binary-copy\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7ff1077e-9f9b-4382-9161-a7e4d8da5193-multus-daemon-config\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-node-log\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752659 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9aaf97f-201c-4eed-b64e-a78004674964-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752666 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f285fa38-b5da-431e-b1a4-e25936eb49d9-iptables-alerter-script\") pod \"iptables-alerter-4vqkf\" (UID: \"f285fa38-b5da-431e-b1a4-e25936eb49d9\") " pod="openshift-network-operator/iptables-alerter-4vqkf"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752702 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-ovnkube-config\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-env-overrides\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ff5c2154-884b-4487-9566-aadfee1d17e4-agent-certs\") pod \"konnectivity-agent-lptm4\" (UID: \"ff5c2154-884b-4487-9566-aadfee1d17e4\") " pod="kube-system/konnectivity-agent-lptm4"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752824 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ff5c2154-884b-4487-9566-aadfee1d17e4-konnectivity-ca\") pod \"konnectivity-agent-lptm4\" (UID: \"ff5c2154-884b-4487-9566-aadfee1d17e4\") " pod="kube-system/konnectivity-agent-lptm4"
Apr 20 20:11:25.754996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752837 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-multus-cni-dir\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f285fa38-b5da-431e-b1a4-e25936eb49d9-host-slash\") pod \"iptables-alerter-4vqkf\" (UID: \"f285fa38-b5da-431e-b1a4-e25936eb49d9\") " pod="openshift-network-operator/iptables-alerter-4vqkf"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-slash\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752914 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9aaf97f-201c-4eed-b64e-a78004674964-cni-binary-copy\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-run-netns\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-etc-selinux\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.752980 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-etc-openvswitch\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753031 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-systemd\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753033 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-sysctl-d\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753090 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-node-log\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-run-netns\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-var-lib-openvswitch\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-run-k8s-cni-cncf-io\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-registration-dir\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-log-socket\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-cni-bin\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9aaf97f-201c-4eed-b64e-a78004674964-os-release\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt"
Apr 20 20:11:25.755830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-var-lib-kubelet\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753360 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lb4g\" (UniqueName: \"kubernetes.io/projected/7ff1077e-9f9b-4382-9161-a7e4d8da5193-kube-api-access-7lb4g\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jdkr\" (UniqueName: \"kubernetes.io/projected/2f856405-928f-4e0f-a6a4-b56a19061640-kube-api-access-6jdkr\") pod \"node-ca-hgbbc\" (UID: \"2f856405-928f-4e0f-a6a4-b56a19061640\") " pod="openshift-image-registry/node-ca-hgbbc"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-device-dir\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753448 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7ff1077e-9f9b-4382-9161-a7e4d8da5193-multus-daemon-config\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753476 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-env-overrides\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753498 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-run-k8s-cni-cncf-io\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753512 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-host\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753547 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-registration-dir\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753560 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-sys-fs\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-log-socket\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753618 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9aaf97f-201c-4eed-b64e-a78004674964-os-release\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-cni-bin\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753672 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-host-var-lib-kubelet\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753673 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-sys\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753900 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/36e9f673-1266-48a1-8251-c342b5b20ca7-tmp\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753436 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36e9f673-1266-48a1-8251-c342b5b20ca7-host\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-multus-socket-dir-parent\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2"
Apr 20 20:11:25.756543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.753992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName:
\"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-device-dir\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:25.757084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.754037 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7ff1077e-9f9b-4382-9161-a7e4d8da5193-multus-socket-dir-parent\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.757084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.754059 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f285fa38-b5da-431e-b1a4-e25936eb49d9-host-slash\") pod \"iptables-alerter-4vqkf\" (UID: \"f285fa38-b5da-431e-b1a4-e25936eb49d9\") " pod="openshift-network-operator/iptables-alerter-4vqkf" Apr 20 20:11:25.757084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.754085 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-host-slash\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.757084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.754129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:25.757084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.754164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ff1077e-9f9b-4382-9161-a7e4d8da5193-cni-binary-copy\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.757084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.754194 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-ovn-node-metrics-cert\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.757084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.754216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-var-lib-openvswitch\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.757084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.754317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-ovnkube-script-lib\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.757084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.754430 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ff5c2154-884b-4487-9566-aadfee1d17e4-konnectivity-ca\") pod \"konnectivity-agent-lptm4\" (UID: \"ff5c2154-884b-4487-9566-aadfee1d17e4\") " pod="kube-system/konnectivity-agent-lptm4" Apr 20 20:11:25.757084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.754575 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-ovnkube-config\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.757084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.756389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/36e9f673-1266-48a1-8251-c342b5b20ca7-etc-tuned\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.757084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.756718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ff5c2154-884b-4487-9566-aadfee1d17e4-agent-certs\") pod \"konnectivity-agent-lptm4\" (UID: \"ff5c2154-884b-4487-9566-aadfee1d17e4\") " pod="kube-system/konnectivity-agent-lptm4" Apr 20 20:11:25.760299 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:25.760195 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:11:25.760299 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:25.760212 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:11:25.760299 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:25.760224 2576 projected.go:194] Error preparing data for projected volume kube-api-access-vgdbl for pod openshift-network-diagnostics/network-check-target-qcvnc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:25.760299 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:25.760291 
2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl podName:1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:26.260257445 +0000 UTC m=+2.130036059 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vgdbl" (UniqueName: "kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl") pod "network-check-target-qcvnc" (UID: "1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:25.762846 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.762821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h49dl\" (UniqueName: \"kubernetes.io/projected/f165a6a5-16a4-48c1-8e9a-9819f7939466-kube-api-access-h49dl\") pod \"node-resolver-rwqjv\" (UID: \"f165a6a5-16a4-48c1-8e9a-9819f7939466\") " pod="openshift-dns/node-resolver-rwqjv" Apr 20 20:11:25.763183 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.763122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkrwr\" (UniqueName: \"kubernetes.io/projected/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-kube-api-access-fkrwr\") pod \"network-metrics-daemon-c89q8\" (UID: \"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2\") " pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:11:25.763753 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.763711 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmfth\" (UniqueName: \"kubernetes.io/projected/a9aaf97f-201c-4eed-b64e-a78004674964-kube-api-access-mmfth\") pod \"multus-additional-cni-plugins-6lhxt\" (UID: \"a9aaf97f-201c-4eed-b64e-a78004674964\") " pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 
20:11:25.763844 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.763716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smvlp\" (UniqueName: \"kubernetes.io/projected/bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f-kube-api-access-smvlp\") pod \"ovnkube-node-jq6pm\" (UID: \"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.763844 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.763798 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7mrr\" (UniqueName: \"kubernetes.io/projected/88eff2a4-dc19-4994-a317-cf7e1c6fb6b4-kube-api-access-k7mrr\") pod \"aws-ebs-csi-driver-node-tspxz\" (UID: \"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:25.763989 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.763899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lb4g\" (UniqueName: \"kubernetes.io/projected/7ff1077e-9f9b-4382-9161-a7e4d8da5193-kube-api-access-7lb4g\") pod \"multus-nbfv2\" (UID: \"7ff1077e-9f9b-4382-9161-a7e4d8da5193\") " pod="openshift-multus/multus-nbfv2" Apr 20 20:11:25.764158 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.764140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jdkr\" (UniqueName: \"kubernetes.io/projected/2f856405-928f-4e0f-a6a4-b56a19061640-kube-api-access-6jdkr\") pod \"node-ca-hgbbc\" (UID: \"2f856405-928f-4e0f-a6a4-b56a19061640\") " pod="openshift-image-registry/node-ca-hgbbc" Apr 20 20:11:25.764954 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.764933 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2d54\" (UniqueName: \"kubernetes.io/projected/36e9f673-1266-48a1-8251-c342b5b20ca7-kube-api-access-h2d54\") pod \"tuned-bd5bk\" (UID: \"36e9f673-1266-48a1-8251-c342b5b20ca7\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.765289 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.765272 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbtvj\" (UniqueName: \"kubernetes.io/projected/f285fa38-b5da-431e-b1a4-e25936eb49d9-kube-api-access-zbtvj\") pod \"iptables-alerter-4vqkf\" (UID: \"f285fa38-b5da-431e-b1a4-e25936eb49d9\") " pod="openshift-network-operator/iptables-alerter-4vqkf" Apr 20 20:11:25.797759 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.797570 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4vqkf" Apr 20 20:11:25.802480 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.802459 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:25.844169 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:25.844142 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53bc6936a6a8564836169d34a32da252.slice/crio-b81ebcb2e389b43a3824bdf62a55cbcd480cd6acfea908945d159f9b2a6d265b WatchSource:0}: Error finding container b81ebcb2e389b43a3824bdf62a55cbcd480cd6acfea908945d159f9b2a6d265b: Status 404 returned error can't find the container with id b81ebcb2e389b43a3824bdf62a55cbcd480cd6acfea908945d159f9b2a6d265b Apr 20 20:11:25.849132 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.849092 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:11:25.967136 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.967116 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" Apr 20 20:11:25.972261 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:25.972237 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36e9f673_1266_48a1_8251_c342b5b20ca7.slice/crio-7dbc73d38028b50522cbd9fb17811afc765c933f1c0ecadfa29c9a9f7de71419 WatchSource:0}: Error finding container 7dbc73d38028b50522cbd9fb17811afc765c933f1c0ecadfa29c9a9f7de71419: Status 404 returned error can't find the container with id 7dbc73d38028b50522cbd9fb17811afc765c933f1c0ecadfa29c9a9f7de71419 Apr 20 20:11:25.972918 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.972901 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:11:25.980416 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.980400 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rwqjv" Apr 20 20:11:25.984897 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:25.984878 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-lptm4" Apr 20 20:11:25.986096 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:25.986076 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf165a6a5_16a4_48c1_8e9a_9819f7939466.slice/crio-8746eb8e64c78374443594de8f381b2fab3b3b2b4d0175ac58a8e2ec7dc4eabe WatchSource:0}: Error finding container 8746eb8e64c78374443594de8f381b2fab3b3b2b4d0175ac58a8e2ec7dc4eabe: Status 404 returned error can't find the container with id 8746eb8e64c78374443594de8f381b2fab3b3b2b4d0175ac58a8e2ec7dc4eabe Apr 20 20:11:25.990316 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:25.990296 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff5c2154_884b_4487_9566_aadfee1d17e4.slice/crio-254ae71411ef86a6930e4c3c54fc72b1eac2bdf76498a21b79278084339b5516 WatchSource:0}: Error finding container 254ae71411ef86a6930e4c3c54fc72b1eac2bdf76498a21b79278084339b5516: Status 404 returned error can't find the container with id 254ae71411ef86a6930e4c3c54fc72b1eac2bdf76498a21b79278084339b5516 Apr 20 20:11:26.001776 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.001759 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hgbbc" Apr 20 20:11:26.007346 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:26.007329 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f856405_928f_4e0f_a6a4_b56a19061640.slice/crio-06a444d180e69d94a78e26ee67735d90956c549f208f6db3b44b4a0c1bea78e6 WatchSource:0}: Error finding container 06a444d180e69d94a78e26ee67735d90956c549f208f6db3b44b4a0c1bea78e6: Status 404 returned error can't find the container with id 06a444d180e69d94a78e26ee67735d90956c549f208f6db3b44b4a0c1bea78e6 Apr 20 20:11:26.019001 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.018987 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6lhxt" Apr 20 20:11:26.020132 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:26.020113 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode535b6aadad4bcde1b5b507c970d0c4f.slice/crio-77be7699820abfec1e4aefd6e289f6912c156aa23150a0f28bc936d51d3a4b04 WatchSource:0}: Error finding container 77be7699820abfec1e4aefd6e289f6912c156aa23150a0f28bc936d51d3a4b04: Status 404 returned error can't find the container with id 77be7699820abfec1e4aefd6e289f6912c156aa23150a0f28bc936d51d3a4b04 Apr 20 20:11:26.024918 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:26.024891 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9aaf97f_201c_4eed_b64e_a78004674964.slice/crio-9ea4470532690ca33ecf2e0c0f80a942b7a93986d47c4bd0780b335594cd1a85 WatchSource:0}: Error finding container 9ea4470532690ca33ecf2e0c0f80a942b7a93986d47c4bd0780b335594cd1a85: Status 404 returned error can't find the container with id 9ea4470532690ca33ecf2e0c0f80a942b7a93986d47c4bd0780b335594cd1a85 Apr 20 20:11:26.030973 ip-10-0-129-247 
kubenswrapper[2576]: I0420 20:11:26.030960 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nbfv2" Apr 20 20:11:26.036479 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:26.036456 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ff1077e_9f9b_4382_9161_a7e4d8da5193.slice/crio-97d0e4fb7b0a6916cfa50bc1583aacf31d42f9bd31534240d499a4cb4efbcc6a WatchSource:0}: Error finding container 97d0e4fb7b0a6916cfa50bc1583aacf31d42f9bd31534240d499a4cb4efbcc6a: Status 404 returned error can't find the container with id 97d0e4fb7b0a6916cfa50bc1583aacf31d42f9bd31534240d499a4cb4efbcc6a Apr 20 20:11:26.055808 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.055794 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" Apr 20 20:11:26.060678 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:26.060656 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88eff2a4_dc19_4994_a317_cf7e1c6fb6b4.slice/crio-67f33c855f08e103b240ce950903aa5be6416a4d304e124333f96e1a9a68bd54 WatchSource:0}: Error finding container 67f33c855f08e103b240ce950903aa5be6416a4d304e124333f96e1a9a68bd54: Status 404 returned error can't find the container with id 67f33c855f08e103b240ce950903aa5be6416a4d304e124333f96e1a9a68bd54 Apr 20 20:11:26.141499 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:26.141480 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf285fa38_b5da_431e_b1a4_e25936eb49d9.slice/crio-aafa262586cbf551d703f2c5315c29c2a6266d921470db33768b80d3f6ae7776 WatchSource:0}: Error finding container aafa262586cbf551d703f2c5315c29c2a6266d921470db33768b80d3f6ae7776: Status 404 returned error can't find the container with id 
aafa262586cbf551d703f2c5315c29c2a6266d921470db33768b80d3f6ae7776 Apr 20 20:11:26.257753 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.257714 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs\") pod \"network-metrics-daemon-c89q8\" (UID: \"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2\") " pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:11:26.257866 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:26.257844 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:26.257920 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:26.257909 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs podName:62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:27.257889009 +0000 UTC m=+3.127667623 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs") pod "network-metrics-daemon-c89q8" (UID: "62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:26.358569 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.358526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdbl\" (UniqueName: \"kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl\") pod \"network-check-target-qcvnc\" (UID: \"1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7\") " pod="openshift-network-diagnostics/network-check-target-qcvnc" Apr 20 20:11:26.358705 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:26.358685 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:11:26.358791 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:26.358710 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:11:26.358791 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:26.358719 2576 projected.go:194] Error preparing data for projected volume kube-api-access-vgdbl for pod openshift-network-diagnostics/network-check-target-qcvnc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:26.358791 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:26.358777 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl podName:1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:11:27.358762942 +0000 UTC m=+3.228541571 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-vgdbl" (UniqueName: "kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl") pod "network-check-target-qcvnc" (UID: "1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:26.372129 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.372108 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:11:26.441956 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:11:26.441292 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd7aaee9_0fd7_4edc_b3a6_ea8cc2c8856f.slice/crio-7f2201b39e65e5ec7ba65c00e8c470f01d7b9209e5614bde66b8ad42a729c676 WatchSource:0}: Error finding container 7f2201b39e65e5ec7ba65c00e8c470f01d7b9209e5614bde66b8ad42a729c676: Status 404 returned error can't find the container with id 7f2201b39e65e5ec7ba65c00e8c470f01d7b9209e5614bde66b8ad42a729c676 Apr 20 20:11:26.685339 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.685249 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:06:25 +0000 UTC" deadline="2027-10-31 14:20:29.53474975 +0000 UTC" Apr 20 20:11:26.685339 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.685287 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13410h9m2.849467237s" Apr 20 20:11:26.790072 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.789857 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 
20:11:26.797680 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.797590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" event={"ID":"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f","Type":"ContainerStarted","Data":"7f2201b39e65e5ec7ba65c00e8c470f01d7b9209e5614bde66b8ad42a729c676"} Apr 20 20:11:26.800464 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.800418 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4vqkf" event={"ID":"f285fa38-b5da-431e-b1a4-e25936eb49d9","Type":"ContainerStarted","Data":"aafa262586cbf551d703f2c5315c29c2a6266d921470db33768b80d3f6ae7776"} Apr 20 20:11:26.805546 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.805503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" event={"ID":"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4","Type":"ContainerStarted","Data":"67f33c855f08e103b240ce950903aa5be6416a4d304e124333f96e1a9a68bd54"} Apr 20 20:11:26.813069 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.813040 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lhxt" event={"ID":"a9aaf97f-201c-4eed-b64e-a78004674964","Type":"ContainerStarted","Data":"9ea4470532690ca33ecf2e0c0f80a942b7a93986d47c4bd0780b335594cd1a85"} Apr 20 20:11:26.816092 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.816006 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lptm4" event={"ID":"ff5c2154-884b-4487-9566-aadfee1d17e4","Type":"ContainerStarted","Data":"254ae71411ef86a6930e4c3c54fc72b1eac2bdf76498a21b79278084339b5516"} Apr 20 20:11:26.821839 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.821751 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" 
event={"ID":"36e9f673-1266-48a1-8251-c342b5b20ca7","Type":"ContainerStarted","Data":"7dbc73d38028b50522cbd9fb17811afc765c933f1c0ecadfa29c9a9f7de71419"}
Apr 20 20:11:26.834495 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.834432 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nbfv2" event={"ID":"7ff1077e-9f9b-4382-9161-a7e4d8da5193","Type":"ContainerStarted","Data":"97d0e4fb7b0a6916cfa50bc1583aacf31d42f9bd31534240d499a4cb4efbcc6a"}
Apr 20 20:11:26.841554 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.841526 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-247.ec2.internal" event={"ID":"e535b6aadad4bcde1b5b507c970d0c4f","Type":"ContainerStarted","Data":"77be7699820abfec1e4aefd6e289f6912c156aa23150a0f28bc936d51d3a4b04"}
Apr 20 20:11:26.852593 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.852568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hgbbc" event={"ID":"2f856405-928f-4e0f-a6a4-b56a19061640","Type":"ContainerStarted","Data":"06a444d180e69d94a78e26ee67735d90956c549f208f6db3b44b4a0c1bea78e6"}
Apr 20 20:11:26.859977 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.859952 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rwqjv" event={"ID":"f165a6a5-16a4-48c1-8e9a-9819f7939466","Type":"ContainerStarted","Data":"8746eb8e64c78374443594de8f381b2fab3b3b2b4d0175ac58a8e2ec7dc4eabe"}
Apr 20 20:11:26.866221 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:26.866197 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal" event={"ID":"53bc6936a6a8564836169d34a32da252","Type":"ContainerStarted","Data":"b81ebcb2e389b43a3824bdf62a55cbcd480cd6acfea908945d159f9b2a6d265b"}
Apr 20 20:11:27.265695 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:27.265613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs\") pod \"network-metrics-daemon-c89q8\" (UID: \"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2\") " pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:27.265875 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:27.265781 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:27.265875 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:27.265843 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs podName:62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:29.265823429 +0000 UTC m=+5.135602043 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs") pod "network-metrics-daemon-c89q8" (UID: "62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:27.366718 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:27.366639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdbl\" (UniqueName: \"kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl\") pod \"network-check-target-qcvnc\" (UID: \"1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7\") " pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:27.366884 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:27.366849 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:11:27.366884 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:27.366870 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:11:27.366884 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:27.366883 2576 projected.go:194] Error preparing data for projected volume kube-api-access-vgdbl for pod openshift-network-diagnostics/network-check-target-qcvnc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:27.367034 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:27.366936 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl podName:1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:29.366919349 +0000 UTC m=+5.236697965 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vgdbl" (UniqueName: "kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl") pod "network-check-target-qcvnc" (UID: "1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:27.686323 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:27.686277 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:06:25 +0000 UTC" deadline="2027-12-21 16:04:50.075667938 +0000 UTC"
Apr 20 20:11:27.686323 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:27.686320 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14635h53m22.389351788s"
Apr 20 20:11:27.781066 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:27.781035 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:27.781279 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:27.781252 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2"
Apr 20 20:11:27.781695 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:27.781677 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:27.781851 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:27.781830 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7"
Apr 20 20:11:28.686008 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:28.685788 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:11:29.280634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:29.280564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs\") pod \"network-metrics-daemon-c89q8\" (UID: \"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2\") " pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:29.281098 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:29.280766 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:29.281098 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:29.280830 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs podName:62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:33.280809762 +0000 UTC m=+9.150588374 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs") pod "network-metrics-daemon-c89q8" (UID: "62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:29.381803 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:29.381769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdbl\" (UniqueName: \"kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl\") pod \"network-check-target-qcvnc\" (UID: \"1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7\") " pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:29.381971 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:29.381931 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:11:29.381971 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:29.381950 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:11:29.381971 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:29.381961 2576 projected.go:194] Error preparing data for projected volume kube-api-access-vgdbl for pod openshift-network-diagnostics/network-check-target-qcvnc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:29.382122 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:29.382017 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl podName:1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:33.381997925 +0000 UTC m=+9.251776551 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-vgdbl" (UniqueName: "kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl") pod "network-check-target-qcvnc" (UID: "1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:29.780887 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:29.780860 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:29.781062 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:29.780989 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7"
Apr 20 20:11:29.781131 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:29.781087 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:29.781183 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:29.781167 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2"
Apr 20 20:11:31.781108 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:31.781074 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:31.781565 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:31.781218 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2"
Apr 20 20:11:31.781654 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:31.781632 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:31.781800 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:31.781730 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7"
Apr 20 20:11:33.315389 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:33.315354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs\") pod \"network-metrics-daemon-c89q8\" (UID: \"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2\") " pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:33.315814 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:33.315489 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:33.315814 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:33.315538 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs podName:62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:41.315523353 +0000 UTC m=+17.185301963 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs") pod "network-metrics-daemon-c89q8" (UID: "62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:33.415711 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:33.415667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdbl\" (UniqueName: \"kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl\") pod \"network-check-target-qcvnc\" (UID: \"1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7\") " pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:33.415889 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:33.415860 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:11:33.415948 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:33.415882 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:11:33.415948 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:33.415911 2576 projected.go:194] Error preparing data for projected volume kube-api-access-vgdbl for pod openshift-network-diagnostics/network-check-target-qcvnc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:33.416049 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:33.415966 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl podName:1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:41.41594772 +0000 UTC m=+17.285726350 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-vgdbl" (UniqueName: "kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl") pod "network-check-target-qcvnc" (UID: "1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:33.781452 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:33.780955 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:33.781452 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:33.780963 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:33.781452 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:33.781090 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2"
Apr 20 20:11:33.781452 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:33.781191 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7"
Apr 20 20:11:35.780821 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:35.780786 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:35.780821 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:35.780804 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:35.781295 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:35.780916 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7"
Apr 20 20:11:35.781295 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:35.781033 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2"
Apr 20 20:11:37.780720 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:37.780674 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:37.781091 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:37.780758 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:37.781091 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:37.780841 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7"
Apr 20 20:11:37.781091 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:37.780955 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2"
Apr 20 20:11:39.780821 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:39.780799 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:39.781178 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:39.780888 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7"
Apr 20 20:11:39.781178 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:39.780801 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:39.781178 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:39.781022 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2"
Apr 20 20:11:41.376833 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:41.376799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs\") pod \"network-metrics-daemon-c89q8\" (UID: \"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2\") " pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:41.377309 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:41.376976 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:41.377309 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:41.377053 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs podName:62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:57.377032968 +0000 UTC m=+33.246811582 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs") pod "network-metrics-daemon-c89q8" (UID: "62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:11:41.477657 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:41.477630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdbl\" (UniqueName: \"kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl\") pod \"network-check-target-qcvnc\" (UID: \"1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7\") " pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:41.477830 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:41.477807 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:11:41.477830 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:41.477827 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:11:41.477900 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:41.477837 2576 projected.go:194] Error preparing data for projected volume kube-api-access-vgdbl for pod openshift-network-diagnostics/network-check-target-qcvnc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:41.477900 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:41.477891 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl podName:1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:57.477873411 +0000 UTC m=+33.347652036 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-vgdbl" (UniqueName: "kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl") pod "network-check-target-qcvnc" (UID: "1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:41.780706 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:41.780677 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:41.780881 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:41.780689 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:41.780881 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:41.780809 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7"
Apr 20 20:11:41.780881 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:41.780853 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2"
Apr 20 20:11:43.781200 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:43.781168 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:43.781627 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:43.781168 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:43.781627 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:43.781293 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7"
Apr 20 20:11:43.781627 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:43.781396 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2"
Apr 20 20:11:44.739344 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:44.739110 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ff1077e_9f9b_4382_9161_a7e4d8da5193.slice/crio-d7c73348b2f316259dc6f31350e883551a25ac90e0554e524b64272e7a3a956b.scope\": RecentStats: unable to find data in memory cache]"
Apr 20 20:11:44.918971 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:44.918940 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nbfv2" event={"ID":"7ff1077e-9f9b-4382-9161-a7e4d8da5193","Type":"ContainerStarted","Data":"d7c73348b2f316259dc6f31350e883551a25ac90e0554e524b64272e7a3a956b"}
Apr 20 20:11:44.921563 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:44.921529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-247.ec2.internal" event={"ID":"e535b6aadad4bcde1b5b507c970d0c4f","Type":"ContainerStarted","Data":"75aa20671bbf5c7126a9fcc3aca6c7d5780d88b6d2872e6776fbd6597b51aaf8"}
Apr 20 20:11:44.927220 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:44.927203 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log"
Apr 20 20:11:44.927560 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:44.927537 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" event={"ID":"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f","Type":"ContainerStarted","Data":"4a7dee904b074710e200982acac5ad1a885fba3ba5ece8f4de33744713a19653"}
Apr 20 20:11:44.927665 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:44.927568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" event={"ID":"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f","Type":"ContainerStarted","Data":"dc627ec0de171dfb18f0d84b18378507d16d9619a48525963067f7c82ecd013f"}
Apr 20 20:11:44.927665 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:44.927583 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" event={"ID":"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f","Type":"ContainerStarted","Data":"ec0dd7aa6baa3f02187c0f8a64f52e439ba14d31c62c75534da442c01edf830e"}
Apr 20 20:11:44.927665 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:44.927594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" event={"ID":"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f","Type":"ContainerStarted","Data":"60b813cf6a53f194f235082a5674af8e1a35a3074f882203e34476e7f0eb93ac"}
Apr 20 20:11:44.929164 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:44.929138 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" event={"ID":"36e9f673-1266-48a1-8251-c342b5b20ca7","Type":"ContainerStarted","Data":"8b9ba639e38702187f9f109349dc660b74caa70bed82236983dce001719af175"}
Apr 20 20:11:44.936016 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:44.935192 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nbfv2" podStartSLOduration=2.289038347 podStartE2EDuration="20.935176285s" podCreationTimestamp="2026-04-20 20:11:24 +0000 UTC" firstStartedPulling="2026-04-20 20:11:26.037821858 +0000 UTC m=+1.907600467" lastFinishedPulling="2026-04-20 20:11:44.68395978 +0000 UTC m=+20.553738405" observedRunningTime="2026-04-20 20:11:44.934522616 +0000 UTC m=+20.804301284" watchObservedRunningTime="2026-04-20 20:11:44.935176285 +0000 UTC m=+20.804954921"
Apr 20 20:11:44.949899 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:44.949792 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bd5bk" podStartSLOduration=2.527465232 podStartE2EDuration="20.949780641s" podCreationTimestamp="2026-04-20 20:11:24 +0000 UTC" firstStartedPulling="2026-04-20 20:11:25.973782357 +0000 UTC m=+1.843560967" lastFinishedPulling="2026-04-20 20:11:44.396097759 +0000 UTC m=+20.265876376" observedRunningTime="2026-04-20 20:11:44.949252011 +0000 UTC m=+20.819030665" watchObservedRunningTime="2026-04-20 20:11:44.949780641 +0000 UTC m=+20.819559272"
Apr 20 20:11:44.962967 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:44.962928 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-247.ec2.internal" podStartSLOduration=19.96291379 podStartE2EDuration="19.96291379s" podCreationTimestamp="2026-04-20 20:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:11:44.962175055 +0000 UTC m=+20.831953698" watchObservedRunningTime="2026-04-20 20:11:44.96291379 +0000 UTC m=+20.832692422"
Apr 20 20:11:45.780845 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.780818 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:45.780967 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.780818 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:45.780967 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:45.780944 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2"
Apr 20 20:11:45.781076 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:45.780995 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7"
Apr 20 20:11:45.932752 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.932703 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hgbbc" event={"ID":"2f856405-928f-4e0f-a6a4-b56a19061640","Type":"ContainerStarted","Data":"16e10b6b53ce711ca4b71486ada103a558194344824fa905b5055b28f673203b"}
Apr 20 20:11:45.934091 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.934057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rwqjv" event={"ID":"f165a6a5-16a4-48c1-8e9a-9819f7939466","Type":"ContainerStarted","Data":"9d8deadd7b30fc47020e78f839b5f19acf7dad3c2ed5c8e889bf2f8bb0b4a40d"}
Apr 20 20:11:45.935864 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.935838 2576 generic.go:358] "Generic (PLEG): container finished" podID="53bc6936a6a8564836169d34a32da252" containerID="6edeb7a1ea54a749edd1f33d9997376a5d8b36d39d1ff92ecd61fb1598300c88" exitCode=0
Apr 20 20:11:45.935976 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.935894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal" event={"ID":"53bc6936a6a8564836169d34a32da252","Type":"ContainerDied","Data":"6edeb7a1ea54a749edd1f33d9997376a5d8b36d39d1ff92ecd61fb1598300c88"}
Apr 20 20:11:45.938852 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.938831 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log"
Apr 20 20:11:45.939173 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.939145 2576 generic.go:358] "Generic (PLEG): container finished" podID="bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f" containerID="ec0dd7aa6baa3f02187c0f8a64f52e439ba14d31c62c75534da442c01edf830e" exitCode=1
Apr 20 20:11:45.939288 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.939172 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" event={"ID":"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f","Type":"ContainerDied","Data":"ec0dd7aa6baa3f02187c0f8a64f52e439ba14d31c62c75534da442c01edf830e"}
Apr 20 20:11:45.939288 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.939199 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" event={"ID":"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f","Type":"ContainerStarted","Data":"1d3013064e8162d17902478ad01d494a10a763dc3e4520bb7557c8ed6583f78f"}
Apr 20 20:11:45.939288 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.939209 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" event={"ID":"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f","Type":"ContainerStarted","Data":"9bb0eaa979127cf619a39bb1f945ea04c087f4de1d74722698bad181158c3a17"}
Apr 20 20:11:45.940506 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.940482 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4vqkf" event={"ID":"f285fa38-b5da-431e-b1a4-e25936eb49d9","Type":"ContainerStarted","Data":"0a47eef18d79d6f76004bd8d2caabec52e51c74dfffcd243abe5a5b9ca4e9423"}
Apr 20 20:11:45.941966 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.941946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" event={"ID":"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4","Type":"ContainerStarted","Data":"1e215a3721e4a0270297c633fa97a17b6b96f00b9463d79eba0c22e6977c8366"}
Apr 20 20:11:45.943412 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.943391 2576 generic.go:358] "Generic (PLEG): container finished" podID="a9aaf97f-201c-4eed-b64e-a78004674964" containerID="77c9f1e206eaefd553646402c16e0b681711d6383badfefddf09db08f35a219c" exitCode=0
Apr 20 20:11:45.943502 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.943462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lhxt" event={"ID":"a9aaf97f-201c-4eed-b64e-a78004674964","Type":"ContainerDied","Data":"77c9f1e206eaefd553646402c16e0b681711d6383badfefddf09db08f35a219c"}
Apr 20 20:11:45.945167 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.945129 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lptm4" event={"ID":"ff5c2154-884b-4487-9566-aadfee1d17e4","Type":"ContainerStarted","Data":"1075dab11d5f6ec5a927ab8485ff5ac723b607e13d541cd5b05403e620cdcd15"}
Apr 20 20:11:45.948286 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.948242 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hgbbc" podStartSLOduration=3.564504345 podStartE2EDuration="21.948228926s" podCreationTimestamp="2026-04-20 20:11:24 +0000 UTC" firstStartedPulling="2026-04-20 20:11:26.008618205 +0000 UTC m=+1.878396813" lastFinishedPulling="2026-04-20 20:11:44.392342772 +0000 UTC m=+20.262121394" observedRunningTime="2026-04-20 20:11:45.948220025 +0000 UTC m=+21.817998657" watchObservedRunningTime="2026-04-20 20:11:45.948228926 +0000 UTC m=+21.818007558"
Apr 20 20:11:45.983539 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.983492 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-lptm4" podStartSLOduration=3.595739217
podStartE2EDuration="21.983475884s" podCreationTimestamp="2026-04-20 20:11:24 +0000 UTC" firstStartedPulling="2026-04-20 20:11:25.991751511 +0000 UTC m=+1.861530123" lastFinishedPulling="2026-04-20 20:11:44.379488163 +0000 UTC m=+20.249266790" observedRunningTime="2026-04-20 20:11:45.96179821 +0000 UTC m=+21.831576842" watchObservedRunningTime="2026-04-20 20:11:45.983475884 +0000 UTC m=+21.853254508" Apr 20 20:11:45.997451 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:45.997415 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rwqjv" podStartSLOduration=3.605408415 podStartE2EDuration="21.997404528s" podCreationTimestamp="2026-04-20 20:11:24 +0000 UTC" firstStartedPulling="2026-04-20 20:11:25.987490307 +0000 UTC m=+1.857268919" lastFinishedPulling="2026-04-20 20:11:44.379486423 +0000 UTC m=+20.249265032" observedRunningTime="2026-04-20 20:11:45.997127842 +0000 UTC m=+21.866906473" watchObservedRunningTime="2026-04-20 20:11:45.997404528 +0000 UTC m=+21.867183159" Apr 20 20:11:46.028109 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:46.028075 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4vqkf" podStartSLOduration=3.778862799 podStartE2EDuration="22.028063666s" podCreationTimestamp="2026-04-20 20:11:24 +0000 UTC" firstStartedPulling="2026-04-20 20:11:26.142772333 +0000 UTC m=+2.012550942" lastFinishedPulling="2026-04-20 20:11:44.391973186 +0000 UTC m=+20.261751809" observedRunningTime="2026-04-20 20:11:46.011858248 +0000 UTC m=+21.881636879" watchObservedRunningTime="2026-04-20 20:11:46.028063666 +0000 UTC m=+21.897842297" Apr 20 20:11:46.363187 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:46.362976 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 20:11:46.716559 ip-10-0-129-247 kubenswrapper[2576]: I0420 
20:11:46.716384 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T20:11:46.363157587Z","UUID":"77ec244d-c209-4816-9556-21419a14c1fb","Handler":null,"Name":"","Endpoint":""} Apr 20 20:11:46.719954 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:46.719920 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 20:11:46.719954 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:46.719957 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 20:11:46.949309 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:46.949257 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" event={"ID":"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4","Type":"ContainerStarted","Data":"4f8d7cd2a4f9833dfccdcc5025819eb937f26b2911ec55f72b256881b3798a25"} Apr 20 20:11:46.952257 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:46.951800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal" event={"ID":"53bc6936a6a8564836169d34a32da252","Type":"ContainerStarted","Data":"83baeebad5741308e2ae383c79ecc4ef445eb3ca8045763dc2d17009b84b9d6b"} Apr 20 20:11:47.780325 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:47.780286 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc" Apr 20 20:11:47.780511 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:47.780287 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:11:47.780511 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:47.780406 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7" Apr 20 20:11:47.780511 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:47.780502 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2" Apr 20 20:11:47.957205 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:47.957176 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:11:47.957870 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:47.957594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" event={"ID":"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f","Type":"ContainerStarted","Data":"9ff5abc925ed2acee567651d084dca545da064cf9c46cce4a7bda34961c855bc"} Apr 20 20:11:47.959687 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:47.959657 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" event={"ID":"88eff2a4-dc19-4994-a317-cf7e1c6fb6b4","Type":"ContainerStarted","Data":"d7bcf91a93f38034c47f80cc80a7ee2aa6b9127781b0ca8afc578830c29b066d"} Apr 20 
20:11:47.975745 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:47.975704 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-247.ec2.internal" podStartSLOduration=22.975687796 podStartE2EDuration="22.975687796s" podCreationTimestamp="2026-04-20 20:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:11:46.967460776 +0000 UTC m=+22.837239407" watchObservedRunningTime="2026-04-20 20:11:47.975687796 +0000 UTC m=+23.845466428" Apr 20 20:11:48.741822 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:48.741787 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-lptm4" Apr 20 20:11:48.742666 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:48.742641 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-lptm4" Apr 20 20:11:48.757027 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:48.756976 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tspxz" podStartSLOduration=3.654046471 podStartE2EDuration="24.756959699s" podCreationTimestamp="2026-04-20 20:11:24 +0000 UTC" firstStartedPulling="2026-04-20 20:11:26.061950218 +0000 UTC m=+1.931728829" lastFinishedPulling="2026-04-20 20:11:47.164863436 +0000 UTC m=+23.034642057" observedRunningTime="2026-04-20 20:11:47.975513683 +0000 UTC m=+23.845292314" watchObservedRunningTime="2026-04-20 20:11:48.756959699 +0000 UTC m=+24.626738333" Apr 20 20:11:48.961649 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:48.961622 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-lptm4" Apr 20 20:11:48.962199 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:48.962025 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-lptm4" Apr 20 20:11:49.781010 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:49.780978 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:11:49.781010 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:49.781000 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc" Apr 20 20:11:49.781238 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:49.781102 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2" Apr 20 20:11:49.781297 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:49.781225 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7" Apr 20 20:11:50.967300 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:50.967117 2576 generic.go:358] "Generic (PLEG): container finished" podID="a9aaf97f-201c-4eed-b64e-a78004674964" containerID="723d1ae0fb033b670aef09aeb302ce95b716deace8713252b47339b14aede39c" exitCode=0 Apr 20 20:11:50.967714 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:50.967217 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lhxt" event={"ID":"a9aaf97f-201c-4eed-b64e-a78004674964","Type":"ContainerDied","Data":"723d1ae0fb033b670aef09aeb302ce95b716deace8713252b47339b14aede39c"} Apr 20 20:11:50.970731 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:50.970713 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:11:50.971087 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:50.971060 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" event={"ID":"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f","Type":"ContainerStarted","Data":"0e6b646e01f365b310ed77745a3da984771c6e1db70fcfbe169e5f7dac1169e6"} Apr 20 20:11:50.971520 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:50.971503 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:50.971676 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:50.971643 2576 scope.go:117] "RemoveContainer" containerID="ec0dd7aa6baa3f02187c0f8a64f52e439ba14d31c62c75534da442c01edf830e" Apr 20 20:11:50.988376 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:50.988351 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:51.781294 ip-10-0-129-247 kubenswrapper[2576]: I0420 
20:11:51.781098 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc" Apr 20 20:11:51.781410 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:51.781102 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:11:51.781410 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:51.781373 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7" Apr 20 20:11:51.781507 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:51.781477 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2" Apr 20 20:11:51.855601 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:51.855580 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:51.889952 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:51.889927 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c89q8"] Apr 20 20:11:51.892463 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:51.892444 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qcvnc"] Apr 20 20:11:51.975837 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:51.975781 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:11:51.976266 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:51.976085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" event={"ID":"bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f","Type":"ContainerStarted","Data":"275d9bfe87debb8cca91e8832083a3c2f4e9fc6532bf8af92a03055db0edac8c"} Apr 20 20:11:51.976403 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:51.976386 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:51.978130 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:51.978109 2576 generic.go:358] "Generic (PLEG): container finished" podID="a9aaf97f-201c-4eed-b64e-a78004674964" containerID="cbd35bc7f59125d3c79d45dabc50211df47ea5dd5ba6eefbcdfb27e9d2192965" exitCode=0 Apr 20 20:11:51.978223 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:51.978140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lhxt" 
event={"ID":"a9aaf97f-201c-4eed-b64e-a78004674964","Type":"ContainerDied","Data":"cbd35bc7f59125d3c79d45dabc50211df47ea5dd5ba6eefbcdfb27e9d2192965"} Apr 20 20:11:51.978223 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:51.978194 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc" Apr 20 20:11:51.978338 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:51.978194 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:11:51.978338 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:51.978277 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7" Apr 20 20:11:51.978428 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:51.978365 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2" Apr 20 20:11:51.991165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:51.991147 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" Apr 20 20:11:52.004355 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:52.004321 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm" podStartSLOduration=9.821081352 podStartE2EDuration="28.00431114s" podCreationTimestamp="2026-04-20 20:11:24 +0000 UTC" firstStartedPulling="2026-04-20 20:11:26.443725749 +0000 UTC m=+2.313504358" lastFinishedPulling="2026-04-20 20:11:44.626955534 +0000 UTC m=+20.496734146" observedRunningTime="2026-04-20 20:11:52.003173261 +0000 UTC m=+27.872951905" watchObservedRunningTime="2026-04-20 20:11:52.00431114 +0000 UTC m=+27.874089771" Apr 20 20:11:52.983774 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:52.983688 2576 generic.go:358] "Generic (PLEG): container finished" podID="a9aaf97f-201c-4eed-b64e-a78004674964" containerID="49d0f049898046ff431c67be4e93b2370bd1c2b9a8d24b66ef74d38dc6da915a" exitCode=0 Apr 20 20:11:52.984150 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:52.983781 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lhxt" event={"ID":"a9aaf97f-201c-4eed-b64e-a78004674964","Type":"ContainerDied","Data":"49d0f049898046ff431c67be4e93b2370bd1c2b9a8d24b66ef74d38dc6da915a"} Apr 20 20:11:53.780376 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:53.780347 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:11:53.780522 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:53.780347 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc" Apr 20 20:11:53.780522 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:53.780466 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2" Apr 20 20:11:53.780616 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:53.780543 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7" Apr 20 20:11:55.781239 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:55.781206 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc" Apr 20 20:11:55.781870 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:55.781207 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:11:55.781870 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:55.781348 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qcvnc" podUID="1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7" Apr 20 20:11:55.781870 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:55.781399 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c89q8" podUID="62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2" Apr 20 20:11:57.406644 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.406611 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs\") pod \"network-metrics-daemon-c89q8\" (UID: \"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2\") " pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:11:57.407018 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:57.406785 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:57.407018 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:57.406849 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs podName:62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:29.406832519 +0000 UTC m=+65.276611129 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs") pod "network-metrics-daemon-c89q8" (UID: "62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:57.422899 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.422874 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-247.ec2.internal" event="NodeReady" Apr 20 20:11:57.423016 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.423003 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 20:11:57.463050 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.463026 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gr79r"] Apr 20 20:11:57.489569 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.489543 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5c5mw"] Apr 20 20:11:57.489763 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.489718 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-gr79r" Apr 20 20:11:57.492567 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.492547 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 20:11:57.492784 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.492765 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 20:11:57.492911 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.492797 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bkrnz\"" Apr 20 20:11:57.500557 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.500361 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gr79r"] Apr 20 20:11:57.500660 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.500576 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5c5mw"] Apr 20 20:11:57.500660 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.500512 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5c5mw" Apr 20 20:11:57.502877 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.502861 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 20:11:57.502992 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.502951 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 20:11:57.502992 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.502967 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 20:11:57.503079 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.502995 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rfvs6\"" Apr 20 20:11:57.507040 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.506920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdbl\" (UniqueName: \"kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl\") pod \"network-check-target-qcvnc\" (UID: \"1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7\") " pod="openshift-network-diagnostics/network-check-target-qcvnc" Apr 20 20:11:57.507401 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:57.507088 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:11:57.507401 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:57.507114 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:11:57.507401 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:57.507126 2576 
projected.go:194] Error preparing data for projected volume kube-api-access-vgdbl for pod openshift-network-diagnostics/network-check-target-qcvnc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:57.507401 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:57.507170 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl podName:1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:29.507157571 +0000 UTC m=+65.376936180 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-vgdbl" (UniqueName: "kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl") pod "network-check-target-qcvnc" (UID: "1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:11:57.608212 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.608139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:11:57.608212 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.608192 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75a04807-4a1d-4a9f-9f26-b677e822247a-config-volume\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:11:57.608440 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.608217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/75a04807-4a1d-4a9f-9f26-b677e822247a-tmp-dir\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:11:57.608440 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.608251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlnbs\" (UniqueName: \"kubernetes.io/projected/75a04807-4a1d-4a9f-9f26-b677e822247a-kube-api-access-vlnbs\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:11:57.608440 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.608276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwh76\" (UniqueName: \"kubernetes.io/projected/ae28c5cd-450f-42d0-a36c-1e045e920a41-kube-api-access-zwh76\") pod \"ingress-canary-5c5mw\" (UID: \"ae28c5cd-450f-42d0-a36c-1e045e920a41\") " pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:11:57.608440 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.608308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert\") pod \"ingress-canary-5c5mw\" (UID: \"ae28c5cd-450f-42d0-a36c-1e045e920a41\") " pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:11:57.708791 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.708759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert\") pod \"ingress-canary-5c5mw\" (UID: \"ae28c5cd-450f-42d0-a36c-1e045e920a41\") " pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:11:57.708953 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.708821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:11:57.708953 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.708877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75a04807-4a1d-4a9f-9f26-b677e822247a-config-volume\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:11:57.708953 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.708902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/75a04807-4a1d-4a9f-9f26-b677e822247a-tmp-dir\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:11:57.708953 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.708928 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlnbs\" (UniqueName: \"kubernetes.io/projected/75a04807-4a1d-4a9f-9f26-b677e822247a-kube-api-access-vlnbs\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:11:57.708953 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:57.708933 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:11:57.709175 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.708989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwh76\" (UniqueName: \"kubernetes.io/projected/ae28c5cd-450f-42d0-a36c-1e045e920a41-kube-api-access-zwh76\") pod \"ingress-canary-5c5mw\" (UID: \"ae28c5cd-450f-42d0-a36c-1e045e920a41\") " pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:11:57.709175 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:57.708998 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:11:57.709175 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:57.709002 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert podName:ae28c5cd-450f-42d0-a36c-1e045e920a41 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:58.208982879 +0000 UTC m=+34.078761493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert") pod "ingress-canary-5c5mw" (UID: "ae28c5cd-450f-42d0-a36c-1e045e920a41") : secret "canary-serving-cert" not found
Apr 20 20:11:57.709175 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:57.709075 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls podName:75a04807-4a1d-4a9f-9f26-b677e822247a nodeName:}" failed. No retries permitted until 2026-04-20 20:11:58.209058277 +0000 UTC m=+34.078836893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls") pod "dns-default-gr79r" (UID: "75a04807-4a1d-4a9f-9f26-b677e822247a") : secret "dns-default-metrics-tls" not found
Apr 20 20:11:57.709378 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.709326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/75a04807-4a1d-4a9f-9f26-b677e822247a-tmp-dir\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:11:57.709549 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.709524 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75a04807-4a1d-4a9f-9f26-b677e822247a-config-volume\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:11:57.720088 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.720068 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlnbs\" (UniqueName: \"kubernetes.io/projected/75a04807-4a1d-4a9f-9f26-b677e822247a-kube-api-access-vlnbs\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:11:57.720238 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.720217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwh76\" (UniqueName: \"kubernetes.io/projected/ae28c5cd-450f-42d0-a36c-1e045e920a41-kube-api-access-zwh76\") pod \"ingress-canary-5c5mw\" (UID: \"ae28c5cd-450f-42d0-a36c-1e045e920a41\") " pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:11:57.780676 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.780646 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:11:57.780832 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.780646 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:11:57.783920 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.783900 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-779fg\""
Apr 20 20:11:57.784025 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.783943 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 20:11:57.784025 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.783975 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 20:11:57.784134 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.784035 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 20:11:57.784134 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:57.783902 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-698bz\""
Apr 20 20:11:58.212937 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:58.212896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert\") pod \"ingress-canary-5c5mw\" (UID: \"ae28c5cd-450f-42d0-a36c-1e045e920a41\") " pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:11:58.213163 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:58.212965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:11:58.213163 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:58.213067 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:11:58.213163 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:58.213110 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:11:58.213163 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:58.213139 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert podName:ae28c5cd-450f-42d0-a36c-1e045e920a41 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:59.21311702 +0000 UTC m=+35.082895652 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert") pod "ingress-canary-5c5mw" (UID: "ae28c5cd-450f-42d0-a36c-1e045e920a41") : secret "canary-serving-cert" not found
Apr 20 20:11:58.213163 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:58.213164 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls podName:75a04807-4a1d-4a9f-9f26-b677e822247a nodeName:}" failed. No retries permitted until 2026-04-20 20:11:59.213147206 +0000 UTC m=+35.082925818 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls") pod "dns-default-gr79r" (UID: "75a04807-4a1d-4a9f-9f26-b677e822247a") : secret "dns-default-metrics-tls" not found
Apr 20 20:11:59.221688 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:59.221663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert\") pod \"ingress-canary-5c5mw\" (UID: \"ae28c5cd-450f-42d0-a36c-1e045e920a41\") " pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:11:59.222103 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:11:59.221710 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:11:59.222103 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:59.221826 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:11:59.222103 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:59.221891 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert podName:ae28c5cd-450f-42d0-a36c-1e045e920a41 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:01.221875467 +0000 UTC m=+37.091654076 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert") pod "ingress-canary-5c5mw" (UID: "ae28c5cd-450f-42d0-a36c-1e045e920a41") : secret "canary-serving-cert" not found
Apr 20 20:11:59.222103 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:59.221833 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:11:59.222103 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:11:59.221948 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls podName:75a04807-4a1d-4a9f-9f26-b677e822247a nodeName:}" failed. No retries permitted until 2026-04-20 20:12:01.221936521 +0000 UTC m=+37.091715130 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls") pod "dns-default-gr79r" (UID: "75a04807-4a1d-4a9f-9f26-b677e822247a") : secret "dns-default-metrics-tls" not found
Apr 20 20:12:00.000813 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:00.000787 2576 generic.go:358] "Generic (PLEG): container finished" podID="a9aaf97f-201c-4eed-b64e-a78004674964" containerID="9fff9eceb6c6cc9e74e8145d327baa7ff5affd0374b10e2ecfcb88bc637e67fb" exitCode=0
Apr 20 20:12:00.000938 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:00.000824 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lhxt" event={"ID":"a9aaf97f-201c-4eed-b64e-a78004674964","Type":"ContainerDied","Data":"9fff9eceb6c6cc9e74e8145d327baa7ff5affd0374b10e2ecfcb88bc637e67fb"}
Apr 20 20:12:01.005207 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:01.005175 2576 generic.go:358] "Generic (PLEG): container finished" podID="a9aaf97f-201c-4eed-b64e-a78004674964" containerID="680ab06f0b49f4d4fcc77d0034df6e11ef9af1635f8cee6bff9b02866a1f0f11" exitCode=0
Apr 20 20:12:01.005570 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:01.005216 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lhxt" event={"ID":"a9aaf97f-201c-4eed-b64e-a78004674964","Type":"ContainerDied","Data":"680ab06f0b49f4d4fcc77d0034df6e11ef9af1635f8cee6bff9b02866a1f0f11"}
Apr 20 20:12:01.232763 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:01.232724 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert\") pod \"ingress-canary-5c5mw\" (UID: \"ae28c5cd-450f-42d0-a36c-1e045e920a41\") " pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:12:01.232891 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:01.232788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:12:01.232891 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:01.232877 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:12:01.232996 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:01.232898 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:12:01.232996 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:01.232922 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls podName:75a04807-4a1d-4a9f-9f26-b677e822247a nodeName:}" failed. No retries permitted until 2026-04-20 20:12:05.232909514 +0000 UTC m=+41.102688123 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls") pod "dns-default-gr79r" (UID: "75a04807-4a1d-4a9f-9f26-b677e822247a") : secret "dns-default-metrics-tls" not found
Apr 20 20:12:01.232996 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:01.232958 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert podName:ae28c5cd-450f-42d0-a36c-1e045e920a41 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:05.232938035 +0000 UTC m=+41.102716648 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert") pod "ingress-canary-5c5mw" (UID: "ae28c5cd-450f-42d0-a36c-1e045e920a41") : secret "canary-serving-cert" not found
Apr 20 20:12:02.010036 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:02.010008 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lhxt" event={"ID":"a9aaf97f-201c-4eed-b64e-a78004674964","Type":"ContainerStarted","Data":"4cf3a78d04f6c2c71e1a4577b3fb2edbb60a61a039c79023fca924751cdb07c9"}
Apr 20 20:12:02.032715 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:02.032666 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6lhxt" podStartSLOduration=5.167714687 podStartE2EDuration="38.032653215s" podCreationTimestamp="2026-04-20 20:11:24 +0000 UTC" firstStartedPulling="2026-04-20 20:11:26.028540275 +0000 UTC m=+1.898318884" lastFinishedPulling="2026-04-20 20:11:58.893478787 +0000 UTC m=+34.763257412" observedRunningTime="2026-04-20 20:12:02.032456586 +0000 UTC m=+37.902235218" watchObservedRunningTime="2026-04-20 20:12:02.032653215 +0000 UTC m=+37.902431846"
Apr 20 20:12:05.259187 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:05.259150 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:12:05.259536 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:05.259210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert\") pod \"ingress-canary-5c5mw\" (UID: \"ae28c5cd-450f-42d0-a36c-1e045e920a41\") " pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:12:05.259536 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:05.259286 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:12:05.259536 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:05.259289 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:12:05.259536 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:05.259331 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert podName:ae28c5cd-450f-42d0-a36c-1e045e920a41 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:13.259320348 +0000 UTC m=+49.129098957 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert") pod "ingress-canary-5c5mw" (UID: "ae28c5cd-450f-42d0-a36c-1e045e920a41") : secret "canary-serving-cert" not found
Apr 20 20:12:05.259536 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:05.259344 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls podName:75a04807-4a1d-4a9f-9f26-b677e822247a nodeName:}" failed. No retries permitted until 2026-04-20 20:12:13.259338089 +0000 UTC m=+49.129116697 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls") pod "dns-default-gr79r" (UID: "75a04807-4a1d-4a9f-9f26-b677e822247a") : secret "dns-default-metrics-tls" not found
Apr 20 20:12:13.308984 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:13.308949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert\") pod \"ingress-canary-5c5mw\" (UID: \"ae28c5cd-450f-42d0-a36c-1e045e920a41\") " pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:12:13.309429 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:13.309001 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:12:13.309429 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:13.309104 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:12:13.309429 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:13.309111 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:12:13.309429 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:13.309172 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls podName:75a04807-4a1d-4a9f-9f26-b677e822247a nodeName:}" failed. No retries permitted until 2026-04-20 20:12:29.309152376 +0000 UTC m=+65.178930985 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls") pod "dns-default-gr79r" (UID: "75a04807-4a1d-4a9f-9f26-b677e822247a") : secret "dns-default-metrics-tls" not found
Apr 20 20:12:13.309429 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:13.309187 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert podName:ae28c5cd-450f-42d0-a36c-1e045e920a41 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:29.309180214 +0000 UTC m=+65.178958822 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert") pod "ingress-canary-5c5mw" (UID: "ae28c5cd-450f-42d0-a36c-1e045e920a41") : secret "canary-serving-cert" not found
Apr 20 20:12:23.996591 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:23.996551 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jq6pm"
Apr 20 20:12:28.742550 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:28.742516 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt"]
Apr 20 20:12:28.779084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:28.779054 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt"]
Apr 20 20:12:28.779228 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:28.779137 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt"
Apr 20 20:12:28.781354 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:28.781332 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 20 20:12:28.781354 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:28.781335 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-mdbt2\""
Apr 20 20:12:28.782404 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:28.782386 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 20 20:12:28.782611 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:28.782418 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 20 20:12:28.782703 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:28.782474 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 20 20:12:28.906291 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:28.906267 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ed8439d6-e4a4-4f7e-be2f-7de0797ab4f2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt\" (UID: \"ed8439d6-e4a4-4f7e-be2f-7de0797ab4f2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt"
Apr 20 20:12:28.906392 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:28.906295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x6v2\" (UniqueName: \"kubernetes.io/projected/ed8439d6-e4a4-4f7e-be2f-7de0797ab4f2-kube-api-access-7x6v2\") pod \"managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt\" (UID: \"ed8439d6-e4a4-4f7e-be2f-7de0797ab4f2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt"
Apr 20 20:12:29.007319 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.007265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ed8439d6-e4a4-4f7e-be2f-7de0797ab4f2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt\" (UID: \"ed8439d6-e4a4-4f7e-be2f-7de0797ab4f2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt"
Apr 20 20:12:29.007319 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.007295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7x6v2\" (UniqueName: \"kubernetes.io/projected/ed8439d6-e4a4-4f7e-be2f-7de0797ab4f2-kube-api-access-7x6v2\") pod \"managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt\" (UID: \"ed8439d6-e4a4-4f7e-be2f-7de0797ab4f2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt"
Apr 20 20:12:29.010686 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.010662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ed8439d6-e4a4-4f7e-be2f-7de0797ab4f2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt\" (UID: \"ed8439d6-e4a4-4f7e-be2f-7de0797ab4f2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt"
Apr 20 20:12:29.025099 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.025079 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x6v2\" (UniqueName: \"kubernetes.io/projected/ed8439d6-e4a4-4f7e-be2f-7de0797ab4f2-kube-api-access-7x6v2\") pod \"managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt\" (UID: \"ed8439d6-e4a4-4f7e-be2f-7de0797ab4f2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt"
Apr 20 20:12:29.105034 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.105012 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt"
Apr 20 20:12:29.291996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.291969 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt"]
Apr 20 20:12:29.295119 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:12:29.295094 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded8439d6_e4a4_4f7e_be2f_7de0797ab4f2.slice/crio-bc340b9d38d9a7179b6f343d19f8d82341216acaf11f98f48b20980f169aba52 WatchSource:0}: Error finding container bc340b9d38d9a7179b6f343d19f8d82341216acaf11f98f48b20980f169aba52: Status 404 returned error can't find the container with id bc340b9d38d9a7179b6f343d19f8d82341216acaf11f98f48b20980f169aba52
Apr 20 20:12:29.309381 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.309362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:12:29.309469 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.309416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert\") pod \"ingress-canary-5c5mw\" (UID: \"ae28c5cd-450f-42d0-a36c-1e045e920a41\") " pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:12:29.309527 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:29.309510 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:12:29.309582 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:29.309573 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls podName:75a04807-4a1d-4a9f-9f26-b677e822247a nodeName:}" failed. No retries permitted until 2026-04-20 20:13:01.309558095 +0000 UTC m=+97.179336709 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls") pod "dns-default-gr79r" (UID: "75a04807-4a1d-4a9f-9f26-b677e822247a") : secret "dns-default-metrics-tls" not found
Apr 20 20:12:29.309618 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:29.309512 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:12:29.309618 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:29.309605 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert podName:ae28c5cd-450f-42d0-a36c-1e045e920a41 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:01.309599643 +0000 UTC m=+97.179378251 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert") pod "ingress-canary-5c5mw" (UID: "ae28c5cd-450f-42d0-a36c-1e045e920a41") : secret "canary-serving-cert" not found
Apr 20 20:12:29.410209 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.410187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs\") pod \"network-metrics-daemon-c89q8\" (UID: \"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2\") " pod="openshift-multus/network-metrics-daemon-c89q8"
Apr 20 20:12:29.412601 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.412585 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 20:12:29.420484 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:29.420470 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 20:12:29.420533 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:29.420518 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs podName:62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:33.420506467 +0000 UTC m=+129.290285075 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs") pod "network-metrics-daemon-c89q8" (UID: "62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2") : secret "metrics-daemon-secret" not found
Apr 20 20:12:29.510573 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.510550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdbl\" (UniqueName: \"kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl\") pod \"network-check-target-qcvnc\" (UID: \"1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7\") " pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:12:29.513307 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.513290 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 20:12:29.523373 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.523357 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 20:12:29.534625 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.534606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgdbl\" (UniqueName: \"kubernetes.io/projected/1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7-kube-api-access-vgdbl\") pod \"network-check-target-qcvnc\" (UID: \"1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7\") " pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:12:29.594227 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.594179 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-698bz\""
Apr 20 20:12:29.603203 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.603185 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:12:29.712360 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:29.712335 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qcvnc"]
Apr 20 20:12:29.716236 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:12:29.716210 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f86d95f_49a9_4d51_afd5_7dc67dfd0cd7.slice/crio-9c9738cebe99190226b076fd538f624023ab7a9060188159496c4c74619d8bea WatchSource:0}: Error finding container 9c9738cebe99190226b076fd538f624023ab7a9060188159496c4c74619d8bea: Status 404 returned error can't find the container with id 9c9738cebe99190226b076fd538f624023ab7a9060188159496c4c74619d8bea
Apr 20 20:12:30.059566 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:30.059535 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qcvnc" event={"ID":"1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7","Type":"ContainerStarted","Data":"9c9738cebe99190226b076fd538f624023ab7a9060188159496c4c74619d8bea"}
Apr 20 20:12:30.060463 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:30.060445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt" event={"ID":"ed8439d6-e4a4-4f7e-be2f-7de0797ab4f2","Type":"ContainerStarted","Data":"bc340b9d38d9a7179b6f343d19f8d82341216acaf11f98f48b20980f169aba52"}
Apr 20 20:12:34.068961 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:34.068924 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qcvnc" event={"ID":"1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7","Type":"ContainerStarted","Data":"0a3204af534e0cda3d2d6bdd93b583237ade3e39990d983c41b27f46fe9dcc80"}
Apr 20 20:12:34.069363 ip-10-0-129-247 kubenswrapper[2576]: I0420
20:12:34.069031 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qcvnc" Apr 20 20:12:34.070176 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:34.070151 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt" event={"ID":"ed8439d6-e4a4-4f7e-be2f-7de0797ab4f2","Type":"ContainerStarted","Data":"d7652613735b90293cdfbf9fd4209542526c17bebce48b912630b2ed7d3047c3"} Apr 20 20:12:34.083289 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:34.083246 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qcvnc" podStartSLOduration=66.732516941 podStartE2EDuration="1m10.083232308s" podCreationTimestamp="2026-04-20 20:11:24 +0000 UTC" firstStartedPulling="2026-04-20 20:12:29.718076512 +0000 UTC m=+65.587855122" lastFinishedPulling="2026-04-20 20:12:33.068791877 +0000 UTC m=+68.938570489" observedRunningTime="2026-04-20 20:12:34.082679243 +0000 UTC m=+69.952457870" watchObservedRunningTime="2026-04-20 20:12:34.083232308 +0000 UTC m=+69.953010942" Apr 20 20:12:34.095626 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:34.095588 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d5bb7c8cf-xbpqt" podStartSLOduration=2.3283179929999998 podStartE2EDuration="6.095580944s" podCreationTimestamp="2026-04-20 20:12:28 +0000 UTC" firstStartedPulling="2026-04-20 20:12:29.29690157 +0000 UTC m=+65.166680180" lastFinishedPulling="2026-04-20 20:12:33.064164508 +0000 UTC m=+68.933943131" observedRunningTime="2026-04-20 20:12:34.09516562 +0000 UTC m=+69.964944250" watchObservedRunningTime="2026-04-20 20:12:34.095580944 +0000 UTC m=+69.965359574" Apr 20 20:12:57.359548 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.359428 2576 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j"] Apr 20 20:12:57.363998 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.363976 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-55686795d4-xz2td"] Apr 20 20:12:57.364154 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.364133 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:12:57.366606 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.366587 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.367976 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.367942 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 20 20:12:57.367976 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.367973 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-c7m5m\"" Apr 20 20:12:57.368198 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.367983 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 20:12:57.368198 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.367988 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 20 20:12:57.368198 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.367990 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 20:12:57.369264 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.369246 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress\"/\"router-dockercfg-cmd4p\"" Apr 20 20:12:57.369409 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.369383 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 20:12:57.369529 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.369289 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 20:12:57.369529 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.369510 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j"] Apr 20 20:12:57.369632 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.369368 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 20:12:57.369632 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.369354 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 20:12:57.369632 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.369272 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 20:12:57.369911 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.369895 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 20:12:57.375883 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.375861 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-55686795d4-xz2td"] Apr 20 20:12:57.386559 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.386541 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.386652 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.386567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.386652 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.386587 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-default-certificate\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.386652 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.386612 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-stats-auth\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.386652 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.386630 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmltp\" (UniqueName: \"kubernetes.io/projected/9c9454ed-ac56-495e-8da5-5f99c3919333-kube-api-access-gmltp\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:12:57.386809 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.386695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpzfr\" (UniqueName: \"kubernetes.io/projected/cea479a5-44c4-448f-8f4e-3e373aa915f3-kube-api-access-qpzfr\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.386809 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.386724 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9c9454ed-ac56-495e-8da5-5f99c3919333-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:12:57.386809 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.386763 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:12:57.487450 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.487425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-stats-auth\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.487537 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.487453 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmltp\" (UniqueName: \"kubernetes.io/projected/9c9454ed-ac56-495e-8da5-5f99c3919333-kube-api-access-gmltp\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:12:57.487537 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.487482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpzfr\" (UniqueName: \"kubernetes.io/projected/cea479a5-44c4-448f-8f4e-3e373aa915f3-kube-api-access-qpzfr\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.487630 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.487609 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9c9454ed-ac56-495e-8da5-5f99c3919333-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:12:57.487677 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.487648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:12:57.487761 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.487727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.487821 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:57.487783 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:12:57.487821 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.487786 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.487918 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.487822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-default-certificate\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.487918 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:57.487853 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls podName:9c9454ed-ac56-495e-8da5-5f99c3919333 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:57.987835806 +0000 UTC m=+93.857614417 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4qc4j" (UID: "9c9454ed-ac56-495e-8da5-5f99c3919333") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:12:57.488027 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:57.487915 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:12:57.488027 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:57.487956 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle podName:cea479a5-44c4-448f-8f4e-3e373aa915f3 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:57.987938302 +0000 UTC m=+93.857716940 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle") pod "router-default-55686795d4-xz2td" (UID: "cea479a5-44c4-448f-8f4e-3e373aa915f3") : configmap references non-existent config key: service-ca.crt Apr 20 20:12:57.488027 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:57.487976 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs podName:cea479a5-44c4-448f-8f4e-3e373aa915f3 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:57.987967033 +0000 UTC m=+93.857745642 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs") pod "router-default-55686795d4-xz2td" (UID: "cea479a5-44c4-448f-8f4e-3e373aa915f3") : secret "router-metrics-certs-default" not found Apr 20 20:12:57.488477 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.488449 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9c9454ed-ac56-495e-8da5-5f99c3919333-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:12:57.490097 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.490076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-stats-auth\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.490273 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.490255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-default-certificate\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.496004 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.495981 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpzfr\" (UniqueName: \"kubernetes.io/projected/cea479a5-44c4-448f-8f4e-3e373aa915f3-kube-api-access-qpzfr\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 
20:12:57.496773 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.496752 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmltp\" (UniqueName: \"kubernetes.io/projected/9c9454ed-ac56-495e-8da5-5f99c3919333-kube-api-access-gmltp\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:12:57.992286 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.992250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:12:57.992424 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.992309 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.992424 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:57.992336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:57.992424 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:57.992377 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not 
found Apr 20 20:12:57.992567 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:57.992432 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:12:57.992567 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:57.992438 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls podName:9c9454ed-ac56-495e-8da5-5f99c3919333 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:58.992425099 +0000 UTC m=+94.862203710 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4qc4j" (UID: "9c9454ed-ac56-495e-8da5-5f99c3919333") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:12:57.992567 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:57.992491 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle podName:cea479a5-44c4-448f-8f4e-3e373aa915f3 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:58.992476218 +0000 UTC m=+94.862254828 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle") pod "router-default-55686795d4-xz2td" (UID: "cea479a5-44c4-448f-8f4e-3e373aa915f3") : configmap references non-existent config key: service-ca.crt Apr 20 20:12:57.992567 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:57.992505 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs podName:cea479a5-44c4-448f-8f4e-3e373aa915f3 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:12:58.99249699 +0000 UTC m=+94.862275600 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs") pod "router-default-55686795d4-xz2td" (UID: "cea479a5-44c4-448f-8f4e-3e373aa915f3") : secret "router-metrics-certs-default" not found Apr 20 20:12:58.999990 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:58.999949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:12:59.000409 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:59.000005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:59.000409 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:12:59.000051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:12:59.000409 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:59.000157 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:12:59.000409 ip-10-0-129-247 kubenswrapper[2576]: E0420 
20:12:59.000176 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:12:59.000409 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:59.000203 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs podName:cea479a5-44c4-448f-8f4e-3e373aa915f3 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:01.00018889 +0000 UTC m=+96.869967499 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs") pod "router-default-55686795d4-xz2td" (UID: "cea479a5-44c4-448f-8f4e-3e373aa915f3") : secret "router-metrics-certs-default" not found Apr 20 20:12:59.000409 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:59.000241 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls podName:9c9454ed-ac56-495e-8da5-5f99c3919333 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:01.000223755 +0000 UTC m=+96.870002384 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4qc4j" (UID: "9c9454ed-ac56-495e-8da5-5f99c3919333") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:12:59.000409 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:12:59.000272 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle podName:cea479a5-44c4-448f-8f4e-3e373aa915f3 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:01.00026237 +0000 UTC m=+96.870041000 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle") pod "router-default-55686795d4-xz2td" (UID: "cea479a5-44c4-448f-8f4e-3e373aa915f3") : configmap references non-existent config key: service-ca.crt Apr 20 20:13:01.014369 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:01.014332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:13:01.014763 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:01.014385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:13:01.014763 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:01.014404 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:13:01.014763 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:01.014471 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:13:01.014763 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:01.014503 2576 secret.go:189] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 20:13:01.014763 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:01.014504 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle podName:cea479a5-44c4-448f-8f4e-3e373aa915f3 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:05.0144886 +0000 UTC m=+100.884267210 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle") pod "router-default-55686795d4-xz2td" (UID: "cea479a5-44c4-448f-8f4e-3e373aa915f3") : configmap references non-existent config key: service-ca.crt
Apr 20 20:13:01.014763 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:01.014546 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls podName:9c9454ed-ac56-495e-8da5-5f99c3919333 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:05.014531354 +0000 UTC m=+100.884309964 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4qc4j" (UID: "9c9454ed-ac56-495e-8da5-5f99c3919333") : secret "cluster-monitoring-operator-tls" not found
Apr 20 20:13:01.014763 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:01.014563 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs podName:cea479a5-44c4-448f-8f4e-3e373aa915f3 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:05.014553952 +0000 UTC m=+100.884332561 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs") pod "router-default-55686795d4-xz2td" (UID: "cea479a5-44c4-448f-8f4e-3e373aa915f3") : secret "router-metrics-certs-default" not found
Apr 20 20:13:01.317134 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:01.317108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert\") pod \"ingress-canary-5c5mw\" (UID: \"ae28c5cd-450f-42d0-a36c-1e045e920a41\") " pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:13:01.317278 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:01.317144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:13:01.317278 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:01.317240 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:13:01.317278 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:01.317241 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:13:01.317371 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:01.317288 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert podName:ae28c5cd-450f-42d0-a36c-1e045e920a41 nodeName:}" failed. No retries permitted until 2026-04-20 20:14:05.317274757 +0000 UTC m=+161.187053367 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert") pod "ingress-canary-5c5mw" (UID: "ae28c5cd-450f-42d0-a36c-1e045e920a41") : secret "canary-serving-cert" not found
Apr 20 20:13:01.317371 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:01.317301 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls podName:75a04807-4a1d-4a9f-9f26-b677e822247a nodeName:}" failed. No retries permitted until 2026-04-20 20:14:05.317295202 +0000 UTC m=+161.187073810 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls") pod "dns-default-gr79r" (UID: "75a04807-4a1d-4a9f-9f26-b677e822247a") : secret "dns-default-metrics-tls" not found
Apr 20 20:13:03.429383 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:03.429356 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rwqjv_f165a6a5-16a4-48c1-8e9a-9819f7939466/dns-node-resolver/0.log"
Apr 20 20:13:03.827003 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:03.826981 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hgbbc_2f856405-928f-4e0f-a6a4-b56a19061640/node-ca/0.log"
Apr 20 20:13:05.042385 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:05.042349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j"
Apr 20 20:13:05.042814 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:05.042402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td"
Apr 20 20:13:05.042814 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:05.042426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td"
Apr 20 20:13:05.042814 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:05.042495 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 20:13:05.042814 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:05.042519 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle podName:cea479a5-44c4-448f-8f4e-3e373aa915f3 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:13.042504949 +0000 UTC m=+108.912283559 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle") pod "router-default-55686795d4-xz2td" (UID: "cea479a5-44c4-448f-8f4e-3e373aa915f3") : configmap references non-existent config key: service-ca.crt
Apr 20 20:13:05.042814 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:05.042542 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls podName:9c9454ed-ac56-495e-8da5-5f99c3919333 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:13.042531052 +0000 UTC m=+108.912309661 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4qc4j" (UID: "9c9454ed-ac56-495e-8da5-5f99c3919333") : secret "cluster-monitoring-operator-tls" not found
Apr 20 20:13:05.042814 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:05.042559 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 20:13:05.042814 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:05.042610 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs podName:cea479a5-44c4-448f-8f4e-3e373aa915f3 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:13.04259647 +0000 UTC m=+108.912375081 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs") pod "router-default-55686795d4-xz2td" (UID: "cea479a5-44c4-448f-8f4e-3e373aa915f3") : secret "router-metrics-certs-default" not found
Apr 20 20:13:05.074261 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:05.074236 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qcvnc"
Apr 20 20:13:07.374158 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.374127 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf"]
Apr 20 20:13:07.376906 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.376888 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf"
Apr 20 20:13:07.379507 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.379481 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 20 20:13:07.379689 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.379562 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 20 20:13:07.379818 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.379802 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 20 20:13:07.380717 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.380697 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:13:07.380842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.380705 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-5pczq\""
Apr 20 20:13:07.383434 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.383360 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf"]
Apr 20 20:13:07.461263 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.461235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf54aa0b-6d6e-447c-9260-19fd40703cdc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-s8wdf\" (UID: \"cf54aa0b-6d6e-447c-9260-19fd40703cdc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf"
Apr 20 20:13:07.461361 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.461272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf54aa0b-6d6e-447c-9260-19fd40703cdc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-s8wdf\" (UID: \"cf54aa0b-6d6e-447c-9260-19fd40703cdc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf"
Apr 20 20:13:07.461361 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.461290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk8c7\" (UniqueName: \"kubernetes.io/projected/cf54aa0b-6d6e-447c-9260-19fd40703cdc-kube-api-access-kk8c7\") pod \"kube-storage-version-migrator-operator-6769c5d45-s8wdf\" (UID: \"cf54aa0b-6d6e-447c-9260-19fd40703cdc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf"
Apr 20 20:13:07.473379 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.473358 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c65sl"]
Apr 20 20:13:07.476124 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.476106 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c65sl"
Apr 20 20:13:07.476913 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.476893 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58"]
Apr 20 20:13:07.478874 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.478858 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-5t5qc\""
Apr 20 20:13:07.478971 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.478859 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:13:07.478971 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.478908 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 20 20:13:07.479455 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.479441 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6dwd9"]
Apr 20 20:13:07.479575 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.479563 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58"
Apr 20 20:13:07.482011 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.481844 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 20 20:13:07.482011 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.481897 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 20 20:13:07.482161 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.482095 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 20 20:13:07.482161 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.482104 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:13:07.482370 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.482312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-gdggk\""
Apr 20 20:13:07.482435 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.482387 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9"
Apr 20 20:13:07.484416 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.484400 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 20 20:13:07.484504 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.484465 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 20 20:13:07.484548 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.484507 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:13:07.484868 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.484854 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 20 20:13:07.485181 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.485165 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-9w5w6\""
Apr 20 20:13:07.486634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.486617 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c65sl"]
Apr 20 20:13:07.490088 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.490047 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 20 20:13:07.490388 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.490364 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58"]
Apr 20 20:13:07.491324 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.491304 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6dwd9"]
Apr 20 20:13:07.562473 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.562442 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf54aa0b-6d6e-447c-9260-19fd40703cdc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-s8wdf\" (UID: \"cf54aa0b-6d6e-447c-9260-19fd40703cdc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf"
Apr 20 20:13:07.562473 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.562470 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kk8c7\" (UniqueName: \"kubernetes.io/projected/cf54aa0b-6d6e-447c-9260-19fd40703cdc-kube-api-access-kk8c7\") pod \"kube-storage-version-migrator-operator-6769c5d45-s8wdf\" (UID: \"cf54aa0b-6d6e-447c-9260-19fd40703cdc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf"
Apr 20 20:13:07.562615 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.562497 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8ab594e-8a42-49c7-bbb9-e82d95d72ee3-trusted-ca\") pod \"console-operator-9d4b6777b-6dwd9\" (UID: \"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3\") " pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9"
Apr 20 20:13:07.562615 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.562583 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/238f5c92-7bf3-4286-9225-285c572921b5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5xd58\" (UID: \"238f5c92-7bf3-4286-9225-285c572921b5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58"
Apr 20 20:13:07.562702 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.562614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8ab594e-8a42-49c7-bbb9-e82d95d72ee3-config\") pod \"console-operator-9d4b6777b-6dwd9\" (UID: \"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3\") " pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9"
Apr 20 20:13:07.562702 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.562640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkkmt\" (UniqueName: \"kubernetes.io/projected/f8ab594e-8a42-49c7-bbb9-e82d95d72ee3-kube-api-access-hkkmt\") pod \"console-operator-9d4b6777b-6dwd9\" (UID: \"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3\") " pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9"
Apr 20 20:13:07.562702 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.562668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsw62\" (UniqueName: \"kubernetes.io/projected/238f5c92-7bf3-4286-9225-285c572921b5-kube-api-access-lsw62\") pod \"service-ca-operator-d6fc45fc5-5xd58\" (UID: \"238f5c92-7bf3-4286-9225-285c572921b5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58"
Apr 20 20:13:07.562847 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.562705 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238f5c92-7bf3-4286-9225-285c572921b5-config\") pod \"service-ca-operator-d6fc45fc5-5xd58\" (UID: \"238f5c92-7bf3-4286-9225-285c572921b5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58"
Apr 20 20:13:07.562847 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.562801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf54aa0b-6d6e-447c-9260-19fd40703cdc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-s8wdf\" (UID: \"cf54aa0b-6d6e-447c-9260-19fd40703cdc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf"
Apr 20 20:13:07.562847 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.562828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8ab594e-8a42-49c7-bbb9-e82d95d72ee3-serving-cert\") pod \"console-operator-9d4b6777b-6dwd9\" (UID: \"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3\") " pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9"
Apr 20 20:13:07.562950 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.562853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm7r6\" (UniqueName: \"kubernetes.io/projected/7f1825d7-23bd-454f-9e8e-d0275aa2f4f9-kube-api-access-pm7r6\") pod \"volume-data-source-validator-7c6cbb6c87-c65sl\" (UID: \"7f1825d7-23bd-454f-9e8e-d0275aa2f4f9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c65sl"
Apr 20 20:13:07.563250 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.563232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf54aa0b-6d6e-447c-9260-19fd40703cdc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-s8wdf\" (UID: \"cf54aa0b-6d6e-447c-9260-19fd40703cdc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf"
Apr 20 20:13:07.564829 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.564814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf54aa0b-6d6e-447c-9260-19fd40703cdc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-s8wdf\" (UID: \"cf54aa0b-6d6e-447c-9260-19fd40703cdc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf"
Apr 20 20:13:07.570543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.570512 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk8c7\" (UniqueName: \"kubernetes.io/projected/cf54aa0b-6d6e-447c-9260-19fd40703cdc-kube-api-access-kk8c7\") pod \"kube-storage-version-migrator-operator-6769c5d45-s8wdf\" (UID: \"cf54aa0b-6d6e-447c-9260-19fd40703cdc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf"
Apr 20 20:13:07.663407 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.663349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/238f5c92-7bf3-4286-9225-285c572921b5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5xd58\" (UID: \"238f5c92-7bf3-4286-9225-285c572921b5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58"
Apr 20 20:13:07.663407 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.663384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8ab594e-8a42-49c7-bbb9-e82d95d72ee3-config\") pod \"console-operator-9d4b6777b-6dwd9\" (UID: \"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3\") " pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9"
Apr 20 20:13:07.663407 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.663402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkkmt\" (UniqueName: \"kubernetes.io/projected/f8ab594e-8a42-49c7-bbb9-e82d95d72ee3-kube-api-access-hkkmt\") pod \"console-operator-9d4b6777b-6dwd9\" (UID: \"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3\") " pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9"
Apr 20 20:13:07.663610 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.663421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsw62\" (UniqueName: \"kubernetes.io/projected/238f5c92-7bf3-4286-9225-285c572921b5-kube-api-access-lsw62\") pod \"service-ca-operator-d6fc45fc5-5xd58\" (UID: \"238f5c92-7bf3-4286-9225-285c572921b5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58"
Apr 20 20:13:07.663610 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.663448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238f5c92-7bf3-4286-9225-285c572921b5-config\") pod \"service-ca-operator-d6fc45fc5-5xd58\" (UID: \"238f5c92-7bf3-4286-9225-285c572921b5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58"
Apr 20 20:13:07.663610 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.663547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8ab594e-8a42-49c7-bbb9-e82d95d72ee3-serving-cert\") pod \"console-operator-9d4b6777b-6dwd9\" (UID: \"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3\") " pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9"
Apr 20 20:13:07.663610 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.663576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pm7r6\" (UniqueName: \"kubernetes.io/projected/7f1825d7-23bd-454f-9e8e-d0275aa2f4f9-kube-api-access-pm7r6\") pod \"volume-data-source-validator-7c6cbb6c87-c65sl\" (UID: \"7f1825d7-23bd-454f-9e8e-d0275aa2f4f9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c65sl"
Apr 20 20:13:07.663851 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.663617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8ab594e-8a42-49c7-bbb9-e82d95d72ee3-trusted-ca\") pod \"console-operator-9d4b6777b-6dwd9\" (UID: \"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3\") " pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9"
Apr 20 20:13:07.664082 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.664051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238f5c92-7bf3-4286-9225-285c572921b5-config\") pod \"service-ca-operator-d6fc45fc5-5xd58\" (UID: \"238f5c92-7bf3-4286-9225-285c572921b5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58"
Apr 20 20:13:07.664332 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.664199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8ab594e-8a42-49c7-bbb9-e82d95d72ee3-config\") pod \"console-operator-9d4b6777b-6dwd9\" (UID: \"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3\") " pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9"
Apr 20 20:13:07.664332 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.664321 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8ab594e-8a42-49c7-bbb9-e82d95d72ee3-trusted-ca\") pod \"console-operator-9d4b6777b-6dwd9\" (UID: \"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3\") " pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9"
Apr 20 20:13:07.665763 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.665724 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/238f5c92-7bf3-4286-9225-285c572921b5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5xd58\" (UID: \"238f5c92-7bf3-4286-9225-285c572921b5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58"
Apr 20 20:13:07.665954 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.665938 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8ab594e-8a42-49c7-bbb9-e82d95d72ee3-serving-cert\") pod \"console-operator-9d4b6777b-6dwd9\" (UID: \"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3\") " pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9"
Apr 20 20:13:07.671320 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.671295 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm7r6\" (UniqueName: \"kubernetes.io/projected/7f1825d7-23bd-454f-9e8e-d0275aa2f4f9-kube-api-access-pm7r6\") pod \"volume-data-source-validator-7c6cbb6c87-c65sl\" (UID: \"7f1825d7-23bd-454f-9e8e-d0275aa2f4f9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c65sl"
Apr 20 20:13:07.671404 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.671346 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsw62\" (UniqueName: \"kubernetes.io/projected/238f5c92-7bf3-4286-9225-285c572921b5-kube-api-access-lsw62\") pod \"service-ca-operator-d6fc45fc5-5xd58\" (UID: \"238f5c92-7bf3-4286-9225-285c572921b5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58"
Apr 20 20:13:07.671448 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.671420 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkkmt\" (UniqueName: \"kubernetes.io/projected/f8ab594e-8a42-49c7-bbb9-e82d95d72ee3-kube-api-access-hkkmt\") pod \"console-operator-9d4b6777b-6dwd9\" (UID: \"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3\") " pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9"
Apr 20 20:13:07.686228 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.686210 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf"
Apr 20 20:13:07.786503 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.786474 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c65sl"
Apr 20 20:13:07.795253 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.795236 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58"
Apr 20 20:13:07.798948 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.798928 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf"]
Apr 20 20:13:07.800826 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.800804 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9"
Apr 20 20:13:07.802321 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:07.802265 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf54aa0b_6d6e_447c_9260_19fd40703cdc.slice/crio-406f9b7e82363a20258684dc576ebe9edb09c87550aae418d80a2ecb7a4fdede WatchSource:0}: Error finding container 406f9b7e82363a20258684dc576ebe9edb09c87550aae418d80a2ecb7a4fdede: Status 404 returned error can't find the container with id 406f9b7e82363a20258684dc576ebe9edb09c87550aae418d80a2ecb7a4fdede
Apr 20 20:13:07.921722 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:07.921619 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c65sl"]
Apr 20 20:13:07.924910 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:07.924860 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f1825d7_23bd_454f_9e8e_d0275aa2f4f9.slice/crio-8a2855f25d44ace0c18d0b401c2787dfa357174ee06eb8bbe37fe4b25fc65018 WatchSource:0}: Error finding container 8a2855f25d44ace0c18d0b401c2787dfa357174ee06eb8bbe37fe4b25fc65018: Status 404 returned error can't find the container with id 8a2855f25d44ace0c18d0b401c2787dfa357174ee06eb8bbe37fe4b25fc65018
Apr 20 20:13:08.136141 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:08.136088 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c65sl" event={"ID":"7f1825d7-23bd-454f-9e8e-d0275aa2f4f9","Type":"ContainerStarted","Data":"8a2855f25d44ace0c18d0b401c2787dfa357174ee06eb8bbe37fe4b25fc65018"}
Apr 20 20:13:08.137340 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:08.137321 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf" event={"ID":"cf54aa0b-6d6e-447c-9260-19fd40703cdc","Type":"ContainerStarted","Data":"406f9b7e82363a20258684dc576ebe9edb09c87550aae418d80a2ecb7a4fdede"}
Apr 20 20:13:08.140905 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:08.140884 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58"]
Apr 20 20:13:08.142365 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:08.141870 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6dwd9"]
Apr 20 20:13:08.145383 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:08.145362 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod238f5c92_7bf3_4286_9225_285c572921b5.slice/crio-5d1c9cd15fbcfba49dd8e35ad8e9ea1be59ebe8028c990cedcf33303b194afc1 WatchSource:0}: Error finding container 5d1c9cd15fbcfba49dd8e35ad8e9ea1be59ebe8028c990cedcf33303b194afc1: Status 404 returned error can't find the container with id 5d1c9cd15fbcfba49dd8e35ad8e9ea1be59ebe8028c990cedcf33303b194afc1
Apr 20 20:13:08.145902 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:08.145882 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8ab594e_8a42_49c7_bbb9_e82d95d72ee3.slice/crio-4863ac8cdbb1ce8d3f14e588a60669d838bfbb655bd8e8f8f74ff7d3d7a4006d WatchSource:0}: Error finding container 4863ac8cdbb1ce8d3f14e588a60669d838bfbb655bd8e8f8f74ff7d3d7a4006d: Status 404 returned error can't find the container with id 4863ac8cdbb1ce8d3f14e588a60669d838bfbb655bd8e8f8f74ff7d3d7a4006d
Apr 20 20:13:09.141186 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:09.141132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58" event={"ID":"238f5c92-7bf3-4286-9225-285c572921b5","Type":"ContainerStarted","Data":"5d1c9cd15fbcfba49dd8e35ad8e9ea1be59ebe8028c990cedcf33303b194afc1"}
Apr 20 20:13:09.142622 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:09.142570 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9" event={"ID":"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3","Type":"ContainerStarted","Data":"4863ac8cdbb1ce8d3f14e588a60669d838bfbb655bd8e8f8f74ff7d3d7a4006d"}
Apr 20 20:13:10.146570 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:10.146520 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c65sl" event={"ID":"7f1825d7-23bd-454f-9e8e-d0275aa2f4f9","Type":"ContainerStarted","Data":"765969d478266ac0fa13ebc2c553c3c3d717b2ec8eaec7e78bf1780be51acab7"}
Apr 20 20:13:10.161592 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:10.161547 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c65sl" podStartSLOduration=1.469781003 podStartE2EDuration="3.161532335s" podCreationTimestamp="2026-04-20 20:13:07 +0000 UTC" firstStartedPulling="2026-04-20 20:13:07.926688146 +0000 UTC m=+103.796466755" lastFinishedPulling="2026-04-20 20:13:09.618439472 +0000 UTC m=+105.488218087" observedRunningTime="2026-04-20 20:13:10.160287168 +0000 UTC m=+106.030065802" watchObservedRunningTime="2026-04-20 20:13:10.161532335 +0000 UTC m=+106.031310987"
Apr 20 20:13:12.157081 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:12.157048 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58" event={"ID":"238f5c92-7bf3-4286-9225-285c572921b5","Type":"ContainerStarted","Data":"d5e9bb176c3322a267012bb3f19d350311b322378729464ca8a435c577a2d19f"}
Apr 20 20:13:12.158397 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:12.158369 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf" event={"ID":"cf54aa0b-6d6e-447c-9260-19fd40703cdc","Type":"ContainerStarted","Data":"83c657f4d77eea1cdfa0b40984b6f8bfda3b0ba304886ac4d95a93b58402ae67"}
Apr 20 20:13:12.159965 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:12.159947 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/0.log"
Apr 20 20:13:12.160059 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:12.159983 2576 generic.go:358] "Generic (PLEG): container finished" podID="f8ab594e-8a42-49c7-bbb9-e82d95d72ee3" containerID="28ecdb7bd33b080a701c17658802f6944bd18cf67b3d2282f7ae6fc01d175ee6" exitCode=255
Apr 20 20:13:12.160059 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:12.160029 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9" event={"ID":"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3","Type":"ContainerDied","Data":"28ecdb7bd33b080a701c17658802f6944bd18cf67b3d2282f7ae6fc01d175ee6"}
Apr 20 20:13:12.160209 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:12.160196 2576 scope.go:117] "RemoveContainer" containerID="28ecdb7bd33b080a701c17658802f6944bd18cf67b3d2282f7ae6fc01d175ee6"
Apr 20 20:13:12.170895 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:12.170850 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58" podStartSLOduration=2.09296559 podStartE2EDuration="5.170837355s" podCreationTimestamp="2026-04-20 20:13:07 +0000 UTC" firstStartedPulling="2026-04-20 20:13:08.147098309 +0000 UTC m=+104.016876921" lastFinishedPulling="2026-04-20 20:13:11.224970072 +0000 UTC m=+107.094748686" observedRunningTime="2026-04-20 20:13:12.170070793 +0000 UTC m=+108.039849426" watchObservedRunningTime="2026-04-20 20:13:12.170837355 +0000 UTC m=+108.040615989"
Apr 20 20:13:12.206832 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:12.206786 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf" podStartSLOduration=1.79341559 podStartE2EDuration="5.206770411s" podCreationTimestamp="2026-04-20 20:13:07 +0000 UTC" firstStartedPulling="2026-04-20 20:13:07.805925194 +0000 UTC m=+103.675703822" lastFinishedPulling="2026-04-20 20:13:11.219280031 +0000 UTC m=+107.089058643" observedRunningTime="2026-04-20 20:13:12.187438085 +0000 UTC m=+108.057216717" watchObservedRunningTime="2026-04-20 20:13:12.206770411 +0000 UTC m=+108.076549037"
Apr 20 20:13:13.105978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:13.105931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:13:13.105978 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:13.105981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:13:13.106174 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:13.106067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:13:13.106174 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:13.106110 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle podName:cea479a5-44c4-448f-8f4e-3e373aa915f3 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:29.106089782 +0000 UTC m=+124.975868417 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle") pod "router-default-55686795d4-xz2td" (UID: "cea479a5-44c4-448f-8f4e-3e373aa915f3") : configmap references non-existent config key: service-ca.crt Apr 20 20:13:13.106174 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:13.106153 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:13:13.106278 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:13.106213 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls podName:9c9454ed-ac56-495e-8da5-5f99c3919333 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:29.106199526 +0000 UTC m=+124.975978135 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4qc4j" (UID: "9c9454ed-ac56-495e-8da5-5f99c3919333") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:13:13.106278 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:13.106152 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:13:13.106278 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:13.106245 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs podName:cea479a5-44c4-448f-8f4e-3e373aa915f3 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:29.106237932 +0000 UTC m=+124.976016541 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs") pod "router-default-55686795d4-xz2td" (UID: "cea479a5-44c4-448f-8f4e-3e373aa915f3") : secret "router-metrics-certs-default" not found Apr 20 20:13:13.163586 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:13.163565 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 20:13:13.163970 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:13.163954 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/0.log" Apr 20 20:13:13.164025 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:13.163988 2576 generic.go:358] "Generic (PLEG): container finished" podID="f8ab594e-8a42-49c7-bbb9-e82d95d72ee3" containerID="5f9ebef0cae28c762224d6f39db1fe850879f50003aa030bfc8e68b58569e9e3" exitCode=255 Apr 20 20:13:13.164107 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:13.164085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9" event={"ID":"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3","Type":"ContainerDied","Data":"5f9ebef0cae28c762224d6f39db1fe850879f50003aa030bfc8e68b58569e9e3"} Apr 20 20:13:13.164150 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:13.164130 2576 scope.go:117] "RemoveContainer" containerID="28ecdb7bd33b080a701c17658802f6944bd18cf67b3d2282f7ae6fc01d175ee6" Apr 20 20:13:13.164330 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:13.164304 2576 scope.go:117] "RemoveContainer" containerID="5f9ebef0cae28c762224d6f39db1fe850879f50003aa030bfc8e68b58569e9e3" Apr 20 20:13:13.164514 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:13.164492 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-6dwd9_openshift-console-operator(f8ab594e-8a42-49c7-bbb9-e82d95d72ee3)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9" podUID="f8ab594e-8a42-49c7-bbb9-e82d95d72ee3" Apr 20 20:13:14.168295 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.168269 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 20:13:14.168649 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.168612 2576 scope.go:117] "RemoveContainer" containerID="5f9ebef0cae28c762224d6f39db1fe850879f50003aa030bfc8e68b58569e9e3" Apr 20 20:13:14.168829 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:14.168810 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-6dwd9_openshift-console-operator(f8ab594e-8a42-49c7-bbb9-e82d95d72ee3)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9" podUID="f8ab594e-8a42-49c7-bbb9-e82d95d72ee3" Apr 20 20:13:14.663182 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.663146 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dnc6s"] Apr 20 20:13:14.667297 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.667275 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:14.669680 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.669656 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 20:13:14.669806 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.669661 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 20:13:14.669806 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.669770 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 20:13:14.669806 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.669796 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 20:13:14.670837 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.670819 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-stvsn\"" Apr 20 20:13:14.676644 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.676623 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dnc6s"] Apr 20 20:13:14.718943 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.718914 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/01f326e1-f137-4977-9c82-2d9ca1b42e9d-crio-socket\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:14.719057 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.718968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5qkk8\" (UniqueName: \"kubernetes.io/projected/01f326e1-f137-4977-9c82-2d9ca1b42e9d-kube-api-access-5qkk8\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:14.719057 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.719019 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/01f326e1-f137-4977-9c82-2d9ca1b42e9d-data-volume\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:14.719156 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.719068 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:14.719156 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.719139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/01f326e1-f137-4977-9c82-2d9ca1b42e9d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:14.819876 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.819855 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dnc6s\" (UID: 
\"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:14.819973 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.819901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/01f326e1-f137-4977-9c82-2d9ca1b42e9d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:14.819973 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.819957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/01f326e1-f137-4977-9c82-2d9ca1b42e9d-crio-socket\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:14.820077 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.819974 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qkk8\" (UniqueName: \"kubernetes.io/projected/01f326e1-f137-4977-9c82-2d9ca1b42e9d-kube-api-access-5qkk8\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:14.820077 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:14.819986 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 20:13:14.820077 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.820006 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/01f326e1-f137-4977-9c82-2d9ca1b42e9d-data-volume\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " 
pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:14.820077 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:14.820047 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls podName:01f326e1-f137-4977-9c82-2d9ca1b42e9d nodeName:}" failed. No retries permitted until 2026-04-20 20:13:15.320028366 +0000 UTC m=+111.189806990 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls") pod "insights-runtime-extractor-dnc6s" (UID: "01f326e1-f137-4977-9c82-2d9ca1b42e9d") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:14.820077 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.820062 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/01f326e1-f137-4977-9c82-2d9ca1b42e9d-crio-socket\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:14.820360 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.820282 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/01f326e1-f137-4977-9c82-2d9ca1b42e9d-data-volume\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:14.820523 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.820502 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/01f326e1-f137-4977-9c82-2d9ca1b42e9d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " 
pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:14.830247 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:14.830224 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qkk8\" (UniqueName: \"kubernetes.io/projected/01f326e1-f137-4977-9c82-2d9ca1b42e9d-kube-api-access-5qkk8\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:15.323790 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:15.323754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:15.324126 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:15.323885 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 20:13:15.324126 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:15.323952 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls podName:01f326e1-f137-4977-9c82-2d9ca1b42e9d nodeName:}" failed. No retries permitted until 2026-04-20 20:13:16.323929504 +0000 UTC m=+112.193708113 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls") pod "insights-runtime-extractor-dnc6s" (UID: "01f326e1-f137-4977-9c82-2d9ca1b42e9d") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:16.332246 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:16.332207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:16.332632 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:16.332362 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 20:13:16.332632 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:16.332428 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls podName:01f326e1-f137-4977-9c82-2d9ca1b42e9d nodeName:}" failed. No retries permitted until 2026-04-20 20:13:18.332413188 +0000 UTC m=+114.202191797 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls") pod "insights-runtime-extractor-dnc6s" (UID: "01f326e1-f137-4977-9c82-2d9ca1b42e9d") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:17.801270 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:17.801234 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9" Apr 20 20:13:17.801644 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:17.801317 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9" Apr 20 20:13:17.801644 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:17.801590 2576 scope.go:117] "RemoveContainer" containerID="5f9ebef0cae28c762224d6f39db1fe850879f50003aa030bfc8e68b58569e9e3" Apr 20 20:13:17.801774 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:17.801756 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-6dwd9_openshift-console-operator(f8ab594e-8a42-49c7-bbb9-e82d95d72ee3)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9" podUID="f8ab594e-8a42-49c7-bbb9-e82d95d72ee3" Apr 20 20:13:18.177962 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:18.177893 2576 scope.go:117] "RemoveContainer" containerID="5f9ebef0cae28c762224d6f39db1fe850879f50003aa030bfc8e68b58569e9e3" Apr 20 20:13:18.178091 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:18.178043 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-6dwd9_openshift-console-operator(f8ab594e-8a42-49c7-bbb9-e82d95d72ee3)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9" podUID="f8ab594e-8a42-49c7-bbb9-e82d95d72ee3" Apr 20 20:13:18.348360 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:18.348333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:18.348477 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:18.348437 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 20:13:18.348536 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:18.348481 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls podName:01f326e1-f137-4977-9c82-2d9ca1b42e9d nodeName:}" failed. No retries permitted until 2026-04-20 20:13:22.348467424 +0000 UTC m=+118.218246033 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls") pod "insights-runtime-extractor-dnc6s" (UID: "01f326e1-f137-4977-9c82-2d9ca1b42e9d") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:22.380622 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:22.380587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:22.381008 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:22.380758 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 20:13:22.381008 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:13:22.380822 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls podName:01f326e1-f137-4977-9c82-2d9ca1b42e9d nodeName:}" failed. No retries permitted until 2026-04-20 20:13:30.380805233 +0000 UTC m=+126.250583865 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls") pod "insights-runtime-extractor-dnc6s" (UID: "01f326e1-f137-4977-9c82-2d9ca1b42e9d") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:29.130941 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:29.130904 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:13:29.131517 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:29.130951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:13:29.131517 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:29.131012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:13:29.131687 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:29.131664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cea479a5-44c4-448f-8f4e-3e373aa915f3-service-ca-bundle\") pod \"router-default-55686795d4-xz2td\" (UID: 
\"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:13:29.133520 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:29.133496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c9454ed-ac56-495e-8da5-5f99c3919333-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4qc4j\" (UID: \"9c9454ed-ac56-495e-8da5-5f99c3919333\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:13:29.133577 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:29.133519 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cea479a5-44c4-448f-8f4e-3e373aa915f3-metrics-certs\") pod \"router-default-55686795d4-xz2td\" (UID: \"cea479a5-44c4-448f-8f4e-3e373aa915f3\") " pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:13:29.177957 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:29.177936 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-c7m5m\"" Apr 20 20:13:29.182751 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:29.182715 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-cmd4p\"" Apr 20 20:13:29.186712 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:29.186697 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" Apr 20 20:13:29.191669 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:29.191653 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:13:29.312346 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:29.312319 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j"] Apr 20 20:13:29.315950 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:29.315923 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c9454ed_ac56_495e_8da5_5f99c3919333.slice/crio-3b4c2597bf97a115351cb6003ab014b39c2b11e0303abca53c304b4aaa4bac59 WatchSource:0}: Error finding container 3b4c2597bf97a115351cb6003ab014b39c2b11e0303abca53c304b4aaa4bac59: Status 404 returned error can't find the container with id 3b4c2597bf97a115351cb6003ab014b39c2b11e0303abca53c304b4aaa4bac59 Apr 20 20:13:29.327983 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:29.327958 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-55686795d4-xz2td"] Apr 20 20:13:29.330372 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:29.330349 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcea479a5_44c4_448f_8f4e_3e373aa915f3.slice/crio-79703416687cf9be46a44207707ab63621218b65d5a542b4b6b4b65b0eb378b3 WatchSource:0}: Error finding container 79703416687cf9be46a44207707ab63621218b65d5a542b4b6b4b65b0eb378b3: Status 404 returned error can't find the container with id 79703416687cf9be46a44207707ab63621218b65d5a542b4b6b4b65b0eb378b3 Apr 20 20:13:30.211643 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:30.211604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-55686795d4-xz2td" event={"ID":"cea479a5-44c4-448f-8f4e-3e373aa915f3","Type":"ContainerStarted","Data":"0355bb38a37c11b3a12528f47e67e70082313a91dc70e14efc1f62d014c01f99"} Apr 20 20:13:30.211643 ip-10-0-129-247 
kubenswrapper[2576]: I0420 20:13:30.211649 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-55686795d4-xz2td" event={"ID":"cea479a5-44c4-448f-8f4e-3e373aa915f3","Type":"ContainerStarted","Data":"79703416687cf9be46a44207707ab63621218b65d5a542b4b6b4b65b0eb378b3"} Apr 20 20:13:30.212967 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:30.212937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" event={"ID":"9c9454ed-ac56-495e-8da5-5f99c3919333","Type":"ContainerStarted","Data":"3b4c2597bf97a115351cb6003ab014b39c2b11e0303abca53c304b4aaa4bac59"} Apr 20 20:13:30.230371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:30.230328 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-55686795d4-xz2td" podStartSLOduration=33.230315909 podStartE2EDuration="33.230315909s" podCreationTimestamp="2026-04-20 20:12:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:13:30.22955637 +0000 UTC m=+126.099335002" watchObservedRunningTime="2026-04-20 20:13:30.230315909 +0000 UTC m=+126.100094534" Apr 20 20:13:30.441784 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:30.441726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:30.444261 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:30.444227 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/01f326e1-f137-4977-9c82-2d9ca1b42e9d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dnc6s\" (UID: \"01f326e1-f137-4977-9c82-2d9ca1b42e9d\") " pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:30.579350 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:30.579315 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-stvsn\"" Apr 20 20:13:30.587468 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:30.587443 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dnc6s" Apr 20 20:13:31.004919 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:31.004892 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dnc6s"] Apr 20 20:13:31.007711 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:31.007685 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01f326e1_f137_4977_9c82_2d9ca1b42e9d.slice/crio-63b53c8083e3685ab230911d38b35a283d416282c9098029c97f5bd373d47c4c WatchSource:0}: Error finding container 63b53c8083e3685ab230911d38b35a283d416282c9098029c97f5bd373d47c4c: Status 404 returned error can't find the container with id 63b53c8083e3685ab230911d38b35a283d416282c9098029c97f5bd373d47c4c Apr 20 20:13:31.192633 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:31.192555 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:13:31.195091 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:31.195069 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:13:31.217301 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:31.217273 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" event={"ID":"9c9454ed-ac56-495e-8da5-5f99c3919333","Type":"ContainerStarted","Data":"bdb73d4db201b29ea00a4b2164f39d011da2e75ef3f9f48e4c23ea55b78e4641"} Apr 20 20:13:31.218749 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:31.218712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dnc6s" event={"ID":"01f326e1-f137-4977-9c82-2d9ca1b42e9d","Type":"ContainerStarted","Data":"afd87761b7048d1557e7a56e9ec4777ad9d258239c87eb14036884ccb0d6395b"} Apr 20 20:13:31.218852 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:31.218755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dnc6s" event={"ID":"01f326e1-f137-4977-9c82-2d9ca1b42e9d","Type":"ContainerStarted","Data":"63b53c8083e3685ab230911d38b35a283d416282c9098029c97f5bd373d47c4c"} Apr 20 20:13:31.219046 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:31.219018 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:13:31.220066 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:31.220050 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-55686795d4-xz2td" Apr 20 20:13:31.237444 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:31.237409 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4qc4j" podStartSLOduration=32.614836477 podStartE2EDuration="34.237399155s" podCreationTimestamp="2026-04-20 20:12:57 +0000 UTC" firstStartedPulling="2026-04-20 20:13:29.317658747 +0000 UTC m=+125.187437358" lastFinishedPulling="2026-04-20 20:13:30.940221412 +0000 UTC m=+126.810000036" observedRunningTime="2026-04-20 20:13:31.236954395 +0000 UTC m=+127.106733026" watchObservedRunningTime="2026-04-20 
20:13:31.237399155 +0000 UTC m=+127.107177786" Apr 20 20:13:31.781387 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:31.781364 2576 scope.go:117] "RemoveContainer" containerID="5f9ebef0cae28c762224d6f39db1fe850879f50003aa030bfc8e68b58569e9e3" Apr 20 20:13:32.223534 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:32.223486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dnc6s" event={"ID":"01f326e1-f137-4977-9c82-2d9ca1b42e9d","Type":"ContainerStarted","Data":"eb18757780a80ff041f662464fb7bf6d2effdd37dc03c8de3bab71482c5132dd"} Apr 20 20:13:32.225124 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:32.225104 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 20:13:32.225258 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:32.225159 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9" event={"ID":"f8ab594e-8a42-49c7-bbb9-e82d95d72ee3","Type":"ContainerStarted","Data":"2c07f694b971c1bfde26ae0de5691e2453738694a49789f142522fcce7cfdee6"} Apr 20 20:13:32.225769 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:32.225707 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9" Apr 20 20:13:32.241965 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:32.241915 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9" podStartSLOduration=22.167531069 podStartE2EDuration="25.241899906s" podCreationTimestamp="2026-04-20 20:13:07 +0000 UTC" firstStartedPulling="2026-04-20 20:13:08.147596414 +0000 UTC m=+104.017375023" lastFinishedPulling="2026-04-20 20:13:11.221965251 +0000 UTC m=+107.091743860" observedRunningTime="2026-04-20 20:13:32.240588665 +0000 
UTC m=+128.110367487" watchObservedRunningTime="2026-04-20 20:13:32.241899906 +0000 UTC m=+128.111678539" Apr 20 20:13:32.321120 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:32.321089 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-6dwd9" Apr 20 20:13:33.467823 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:33.467757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs\") pod \"network-metrics-daemon-c89q8\" (UID: \"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2\") " pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:13:33.470035 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:33.470006 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2-metrics-certs\") pod \"network-metrics-daemon-c89q8\" (UID: \"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2\") " pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:13:33.500341 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:33.500315 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-779fg\"" Apr 20 20:13:33.507593 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:33.507575 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c89q8" Apr 20 20:13:33.618859 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:33.618830 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c89q8"] Apr 20 20:13:33.622665 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:33.622635 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62a073ac_fa6f_4ebe_a0d3_7ab63d9c9de2.slice/crio-2e9d9387eeac6f3b2331b638fd04c80a505457ea4e5f50a25d28315a75fe0854 WatchSource:0}: Error finding container 2e9d9387eeac6f3b2331b638fd04c80a505457ea4e5f50a25d28315a75fe0854: Status 404 returned error can't find the container with id 2e9d9387eeac6f3b2331b638fd04c80a505457ea4e5f50a25d28315a75fe0854 Apr 20 20:13:34.232453 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:34.232417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c89q8" event={"ID":"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2","Type":"ContainerStarted","Data":"2e9d9387eeac6f3b2331b638fd04c80a505457ea4e5f50a25d28315a75fe0854"} Apr 20 20:13:34.236099 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:34.236043 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dnc6s" event={"ID":"01f326e1-f137-4977-9c82-2d9ca1b42e9d","Type":"ContainerStarted","Data":"ea29eb5e923354478556a8e45452add4b77279a29a4494c5a94525702c555375"} Apr 20 20:13:34.254619 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:34.254577 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dnc6s" podStartSLOduration=18.170859785 podStartE2EDuration="20.254563327s" podCreationTimestamp="2026-04-20 20:13:14 +0000 UTC" firstStartedPulling="2026-04-20 20:13:31.084927112 +0000 UTC m=+126.954705722" lastFinishedPulling="2026-04-20 20:13:33.168630654 +0000 UTC 
m=+129.038409264" observedRunningTime="2026-04-20 20:13:34.253185819 +0000 UTC m=+130.122964451" watchObservedRunningTime="2026-04-20 20:13:34.254563327 +0000 UTC m=+130.124341957" Apr 20 20:13:35.240177 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:35.240137 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c89q8" event={"ID":"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2","Type":"ContainerStarted","Data":"2087f454ed83d3f1b806daac400270801ade15e7ce8c5a4c276f3e5ab0ce9c1e"} Apr 20 20:13:35.240177 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:35.240179 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c89q8" event={"ID":"62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2","Type":"ContainerStarted","Data":"b8b94621cea884a27bc767562e0190aff4afe799a3d85ee696f5b877d1a317c6"} Apr 20 20:13:35.255662 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:35.255601 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-c89q8" podStartSLOduration=130.240636308 podStartE2EDuration="2m11.25558467s" podCreationTimestamp="2026-04-20 20:11:24 +0000 UTC" firstStartedPulling="2026-04-20 20:13:33.624654446 +0000 UTC m=+129.494433055" lastFinishedPulling="2026-04-20 20:13:34.63960279 +0000 UTC m=+130.509381417" observedRunningTime="2026-04-20 20:13:35.254641469 +0000 UTC m=+131.124420099" watchObservedRunningTime="2026-04-20 20:13:35.25558467 +0000 UTC m=+131.125363302" Apr 20 20:13:38.018876 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.018840 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-pts4j"] Apr 20 20:13:38.021764 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.021710 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pts4j" Apr 20 20:13:38.023259 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.023222 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-lqqbc"] Apr 20 20:13:38.024634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.024610 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-5dhvp\"" Apr 20 20:13:38.025940 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.025868 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-g897h"] Apr 20 20:13:38.026039 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.026005 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lqqbc" Apr 20 20:13:38.028545 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.028524 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 20:13:38.028894 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.028568 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 20:13:38.028894 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.028691 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-ttsbm\"" Apr 20 20:13:38.028894 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.028754 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2n7mw"] Apr 20 20:13:38.029053 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.028987 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-g897h" Apr 20 20:13:38.031812 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.031790 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2n7mw" Apr 20 20:13:38.032436 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.032415 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 20 20:13:38.032584 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.032563 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 20 20:13:38.032679 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.032510 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-2kzch\"" Apr 20 20:13:38.033202 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.033162 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-pts4j"] Apr 20 20:13:38.034147 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.034126 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 20:13:38.034865 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.034557 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-w45v4\"" Apr 20 20:13:38.046723 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.046699 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lqqbc"] Apr 20 20:13:38.075895 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.070305 2576 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-g897h"] Apr 20 20:13:38.075895 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.070343 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2n7mw"] Apr 20 20:13:38.102785 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.102764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctfhq\" (UniqueName: \"kubernetes.io/projected/2b6fdc39-f579-4589-9eb1-5edb0e8bf56b-kube-api-access-ctfhq\") pod \"downloads-6bcc868b7-lqqbc\" (UID: \"2b6fdc39-f579-4589-9eb1-5edb0e8bf56b\") " pod="openshift-console/downloads-6bcc868b7-lqqbc" Apr 20 20:13:38.102883 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.102802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrbvk\" (UniqueName: \"kubernetes.io/projected/448c06af-22c3-45bf-8cf3-2a424611877b-kube-api-access-rrbvk\") pod \"network-check-source-8894fc9bd-pts4j\" (UID: \"448c06af-22c3-45bf-8cf3-2a424611877b\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pts4j" Apr 20 20:13:38.203886 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.203851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1e85e520-969d-4652-b000-05191ade92fc-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2n7mw\" (UID: \"1e85e520-969d-4652-b000-05191ade92fc\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2n7mw" Apr 20 20:13:38.204037 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.203900 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/e41d9a3d-07e1-4603-8db1-ef454b3aa769-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-g897h\" (UID: \"e41d9a3d-07e1-4603-8db1-ef454b3aa769\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g897h" Apr 20 20:13:38.204037 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.203973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctfhq\" (UniqueName: \"kubernetes.io/projected/2b6fdc39-f579-4589-9eb1-5edb0e8bf56b-kube-api-access-ctfhq\") pod \"downloads-6bcc868b7-lqqbc\" (UID: \"2b6fdc39-f579-4589-9eb1-5edb0e8bf56b\") " pod="openshift-console/downloads-6bcc868b7-lqqbc" Apr 20 20:13:38.204037 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.204008 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrbvk\" (UniqueName: \"kubernetes.io/projected/448c06af-22c3-45bf-8cf3-2a424611877b-kube-api-access-rrbvk\") pod \"network-check-source-8894fc9bd-pts4j\" (UID: \"448c06af-22c3-45bf-8cf3-2a424611877b\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pts4j" Apr 20 20:13:38.204037 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.204030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e41d9a3d-07e1-4603-8db1-ef454b3aa769-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-g897h\" (UID: \"e41d9a3d-07e1-4603-8db1-ef454b3aa769\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g897h" Apr 20 20:13:38.212763 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.212709 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctfhq\" (UniqueName: \"kubernetes.io/projected/2b6fdc39-f579-4589-9eb1-5edb0e8bf56b-kube-api-access-ctfhq\") pod \"downloads-6bcc868b7-lqqbc\" (UID: \"2b6fdc39-f579-4589-9eb1-5edb0e8bf56b\") " 
pod="openshift-console/downloads-6bcc868b7-lqqbc" Apr 20 20:13:38.212763 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.212723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrbvk\" (UniqueName: \"kubernetes.io/projected/448c06af-22c3-45bf-8cf3-2a424611877b-kube-api-access-rrbvk\") pod \"network-check-source-8894fc9bd-pts4j\" (UID: \"448c06af-22c3-45bf-8cf3-2a424611877b\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pts4j" Apr 20 20:13:38.304838 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.304814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e41d9a3d-07e1-4603-8db1-ef454b3aa769-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-g897h\" (UID: \"e41d9a3d-07e1-4603-8db1-ef454b3aa769\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g897h" Apr 20 20:13:38.304937 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.304861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e41d9a3d-07e1-4603-8db1-ef454b3aa769-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-g897h\" (UID: \"e41d9a3d-07e1-4603-8db1-ef454b3aa769\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g897h" Apr 20 20:13:38.305032 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.305014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1e85e520-969d-4652-b000-05191ade92fc-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2n7mw\" (UID: \"1e85e520-969d-4652-b000-05191ade92fc\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2n7mw" Apr 20 20:13:38.305359 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.305341 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e41d9a3d-07e1-4603-8db1-ef454b3aa769-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-g897h\" (UID: \"e41d9a3d-07e1-4603-8db1-ef454b3aa769\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g897h" Apr 20 20:13:38.307140 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.307124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e41d9a3d-07e1-4603-8db1-ef454b3aa769-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-g897h\" (UID: \"e41d9a3d-07e1-4603-8db1-ef454b3aa769\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g897h" Apr 20 20:13:38.307349 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.307333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1e85e520-969d-4652-b000-05191ade92fc-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2n7mw\" (UID: \"1e85e520-969d-4652-b000-05191ade92fc\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2n7mw" Apr 20 20:13:38.334826 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.334810 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pts4j" Apr 20 20:13:38.371139 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.371106 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lqqbc" Apr 20 20:13:38.385308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.385284 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-g897h" Apr 20 20:13:38.391373 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.390979 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2n7mw" Apr 20 20:13:38.466173 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.466124 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-pts4j"] Apr 20 20:13:38.471362 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:38.471299 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod448c06af_22c3_45bf_8cf3_2a424611877b.slice/crio-4ad8f6a6feb53e6f35f7ca722e9e9e1560a6a312398b88a45de7e4f5bb230fe6 WatchSource:0}: Error finding container 4ad8f6a6feb53e6f35f7ca722e9e9e1560a6a312398b88a45de7e4f5bb230fe6: Status 404 returned error can't find the container with id 4ad8f6a6feb53e6f35f7ca722e9e9e1560a6a312398b88a45de7e4f5bb230fe6 Apr 20 20:13:38.525837 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.525797 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lqqbc"] Apr 20 20:13:38.540790 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.540761 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-g897h"] Apr 20 20:13:38.541136 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:38.541100 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b6fdc39_f579_4589_9eb1_5edb0e8bf56b.slice/crio-c44cee16ffa1e01e4768ddc8d97b409d0d801b2896fa0a520e76e25e7d4dce1d WatchSource:0}: Error finding container c44cee16ffa1e01e4768ddc8d97b409d0d801b2896fa0a520e76e25e7d4dce1d: Status 404 returned error can't find the container with 
id c44cee16ffa1e01e4768ddc8d97b409d0d801b2896fa0a520e76e25e7d4dce1d Apr 20 20:13:38.543277 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:38.543116 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode41d9a3d_07e1_4603_8db1_ef454b3aa769.slice/crio-4f23ee8be87322598c06bf3bf02bc5b2e6fade6f9d5e72145c416041ceee7212 WatchSource:0}: Error finding container 4f23ee8be87322598c06bf3bf02bc5b2e6fade6f9d5e72145c416041ceee7212: Status 404 returned error can't find the container with id 4f23ee8be87322598c06bf3bf02bc5b2e6fade6f9d5e72145c416041ceee7212 Apr 20 20:13:38.558675 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:38.558621 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2n7mw"] Apr 20 20:13:38.561015 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:38.560995 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e85e520_969d_4652_b000_05191ade92fc.slice/crio-a34f21d6189351351b2ddfc2b3748d667e9ad2f24a08fb88f262242922487d0d WatchSource:0}: Error finding container a34f21d6189351351b2ddfc2b3748d667e9ad2f24a08fb88f262242922487d0d: Status 404 returned error can't find the container with id a34f21d6189351351b2ddfc2b3748d667e9ad2f24a08fb88f262242922487d0d Apr 20 20:13:39.254028 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:39.253858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2n7mw" event={"ID":"1e85e520-969d-4652-b000-05191ade92fc","Type":"ContainerStarted","Data":"a34f21d6189351351b2ddfc2b3748d667e9ad2f24a08fb88f262242922487d0d"} Apr 20 20:13:39.254969 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:39.254934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-g897h" 
event={"ID":"e41d9a3d-07e1-4603-8db1-ef454b3aa769","Type":"ContainerStarted","Data":"4f23ee8be87322598c06bf3bf02bc5b2e6fade6f9d5e72145c416041ceee7212"}
Apr 20 20:13:39.256661 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:39.256591 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pts4j" event={"ID":"448c06af-22c3-45bf-8cf3-2a424611877b","Type":"ContainerStarted","Data":"c68fd4e282a1680d53ddf9ed52113bccada8b4e4b5610c1759e927bd3e610b99"}
Apr 20 20:13:39.256661 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:39.256625 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pts4j" event={"ID":"448c06af-22c3-45bf-8cf3-2a424611877b","Type":"ContainerStarted","Data":"4ad8f6a6feb53e6f35f7ca722e9e9e1560a6a312398b88a45de7e4f5bb230fe6"}
Apr 20 20:13:39.257957 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:39.257932 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lqqbc" event={"ID":"2b6fdc39-f579-4589-9eb1-5edb0e8bf56b","Type":"ContainerStarted","Data":"c44cee16ffa1e01e4768ddc8d97b409d0d801b2896fa0a520e76e25e7d4dce1d"}
Apr 20 20:13:39.271947 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:39.271902 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pts4j" podStartSLOduration=1.271885071 podStartE2EDuration="1.271885071s" podCreationTimestamp="2026-04-20 20:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:13:39.270206016 +0000 UTC m=+135.139984646" watchObservedRunningTime="2026-04-20 20:13:39.271885071 +0000 UTC m=+135.141663703"
Apr 20 20:13:41.264762 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:41.264712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2n7mw" event={"ID":"1e85e520-969d-4652-b000-05191ade92fc","Type":"ContainerStarted","Data":"0211e354e5dad465a7f1fad48411c2d0c227dd20f8459e64eb5c457a906fcf25"}
Apr 20 20:13:41.266207 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:41.266181 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-g897h" event={"ID":"e41d9a3d-07e1-4603-8db1-ef454b3aa769","Type":"ContainerStarted","Data":"742855bc34bc596dd2f85792d3b4ff76ab5b749b9ed1f47936f0ec70c343c2cb"}
Apr 20 20:13:41.281668 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:41.281614 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2n7mw" podStartSLOduration=1.563167448 podStartE2EDuration="3.281599792s" podCreationTimestamp="2026-04-20 20:13:38 +0000 UTC" firstStartedPulling="2026-04-20 20:13:38.562560273 +0000 UTC m=+134.432338881" lastFinishedPulling="2026-04-20 20:13:40.28099261 +0000 UTC m=+136.150771225" observedRunningTime="2026-04-20 20:13:41.2808442 +0000 UTC m=+137.150622829" watchObservedRunningTime="2026-04-20 20:13:41.281599792 +0000 UTC m=+137.151378423"
Apr 20 20:13:41.295955 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:41.295903 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-g897h" podStartSLOduration=1.56232327 podStartE2EDuration="3.295884876s" podCreationTimestamp="2026-04-20 20:13:38 +0000 UTC" firstStartedPulling="2026-04-20 20:13:38.54494411 +0000 UTC m=+134.414722718" lastFinishedPulling="2026-04-20 20:13:40.2785057 +0000 UTC m=+136.148284324" observedRunningTime="2026-04-20 20:13:41.295031069 +0000 UTC m=+137.164809699" watchObservedRunningTime="2026-04-20 20:13:41.295884876 +0000 UTC m=+137.165663509"
Apr 20 20:13:42.269717 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.269673 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2n7mw"
Apr 20 20:13:42.274622 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.274600 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2n7mw"
Apr 20 20:13:42.479047 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.479018 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"]
Apr 20 20:13:42.482529 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.482511 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"
Apr 20 20:13:42.485449 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.485425 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 20 20:13:42.485449 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.485427 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 20 20:13:42.485603 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.485544 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 20:13:42.485777 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.485729 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-6nt8c\""
Apr 20 20:13:42.489261 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.489239 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"]
Apr 20 20:13:42.639422 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.639377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9148d960-a444-4dba-a7d7-10703c00c120-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-tmzg4\" (UID: \"9148d960-a444-4dba-a7d7-10703c00c120\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"
Apr 20 20:13:42.639586 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.639427 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n424v\" (UniqueName: \"kubernetes.io/projected/9148d960-a444-4dba-a7d7-10703c00c120-kube-api-access-n424v\") pod \"prometheus-operator-5676c8c784-tmzg4\" (UID: \"9148d960-a444-4dba-a7d7-10703c00c120\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"
Apr 20 20:13:42.639640 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.639581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9148d960-a444-4dba-a7d7-10703c00c120-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-tmzg4\" (UID: \"9148d960-a444-4dba-a7d7-10703c00c120\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"
Apr 20 20:13:42.639684 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.639662 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9148d960-a444-4dba-a7d7-10703c00c120-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-tmzg4\" (UID: \"9148d960-a444-4dba-a7d7-10703c00c120\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"
Apr 20 20:13:42.740719 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.740684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9148d960-a444-4dba-a7d7-10703c00c120-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-tmzg4\" (UID: \"9148d960-a444-4dba-a7d7-10703c00c120\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"
Apr 20 20:13:42.740900 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.740768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9148d960-a444-4dba-a7d7-10703c00c120-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-tmzg4\" (UID: \"9148d960-a444-4dba-a7d7-10703c00c120\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"
Apr 20 20:13:42.740900 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.740807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9148d960-a444-4dba-a7d7-10703c00c120-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-tmzg4\" (UID: \"9148d960-a444-4dba-a7d7-10703c00c120\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"
Apr 20 20:13:42.740900 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.740834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n424v\" (UniqueName: \"kubernetes.io/projected/9148d960-a444-4dba-a7d7-10703c00c120-kube-api-access-n424v\") pod \"prometheus-operator-5676c8c784-tmzg4\" (UID: \"9148d960-a444-4dba-a7d7-10703c00c120\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"
Apr 20 20:13:42.742191 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.742151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9148d960-a444-4dba-a7d7-10703c00c120-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-tmzg4\" (UID: \"9148d960-a444-4dba-a7d7-10703c00c120\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"
Apr 20 20:13:42.743681 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.743657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9148d960-a444-4dba-a7d7-10703c00c120-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-tmzg4\" (UID: \"9148d960-a444-4dba-a7d7-10703c00c120\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"
Apr 20 20:13:42.743681 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.743669 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9148d960-a444-4dba-a7d7-10703c00c120-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-tmzg4\" (UID: \"9148d960-a444-4dba-a7d7-10703c00c120\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"
Apr 20 20:13:42.748462 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.748443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n424v\" (UniqueName: \"kubernetes.io/projected/9148d960-a444-4dba-a7d7-10703c00c120-kube-api-access-n424v\") pod \"prometheus-operator-5676c8c784-tmzg4\" (UID: \"9148d960-a444-4dba-a7d7-10703c00c120\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"
Apr 20 20:13:42.794246 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.794224 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"
Apr 20 20:13:42.930525 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:42.930496 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-tmzg4"]
Apr 20 20:13:42.933338 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:42.933308 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9148d960_a444_4dba_a7d7_10703c00c120.slice/crio-babc04400488838534fddb8c2654f9c2a8789b6ad3ac1ee3bad4f8a497e2ef76 WatchSource:0}: Error finding container babc04400488838534fddb8c2654f9c2a8789b6ad3ac1ee3bad4f8a497e2ef76: Status 404 returned error can't find the container with id babc04400488838534fddb8c2654f9c2a8789b6ad3ac1ee3bad4f8a497e2ef76
Apr 20 20:13:43.273685 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:43.273605 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4" event={"ID":"9148d960-a444-4dba-a7d7-10703c00c120","Type":"ContainerStarted","Data":"babc04400488838534fddb8c2654f9c2a8789b6ad3ac1ee3bad4f8a497e2ef76"}
Apr 20 20:13:45.281177 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:45.281136 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4" event={"ID":"9148d960-a444-4dba-a7d7-10703c00c120","Type":"ContainerStarted","Data":"7000273cb79147137c2dca1add1712a17e87d5c0d92942061040c8fabda8f895"}
Apr 20 20:13:45.281601 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:45.281184 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4" event={"ID":"9148d960-a444-4dba-a7d7-10703c00c120","Type":"ContainerStarted","Data":"4320f9c4f94d94047ca3ecbf18de5d9f626420f00ccacad10b62fd40085da907"}
Apr 20 20:13:45.297877 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:45.297821 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-tmzg4" podStartSLOduration=1.331187901 podStartE2EDuration="3.297803258s" podCreationTimestamp="2026-04-20 20:13:42 +0000 UTC" firstStartedPulling="2026-04-20 20:13:42.935866185 +0000 UTC m=+138.805644794" lastFinishedPulling="2026-04-20 20:13:44.902481532 +0000 UTC m=+140.772260151" observedRunningTime="2026-04-20 20:13:45.296229146 +0000 UTC m=+141.166007779" watchObservedRunningTime="2026-04-20 20:13:45.297803258 +0000 UTC m=+141.167581890"
Apr 20 20:13:46.822103 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.822069 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"]
Apr 20 20:13:46.825860 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.825833 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"
Apr 20 20:13:46.828716 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.828690 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 20 20:13:46.828958 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.828939 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 20 20:13:46.829046 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.829012 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-jghgw\""
Apr 20 20:13:46.834465 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.833982 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"]
Apr 20 20:13:46.838529 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.838511 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tc7w5"]
Apr 20 20:13:46.841410 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.841391 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:46.843947 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.843928 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-c8f4q\""
Apr 20 20:13:46.844198 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.844183 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 20:13:46.844956 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.844384 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 20:13:46.844956 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.844565 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 20:13:46.975135 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.975101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:46.975267 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.975147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nmtx\" (UniqueName: \"kubernetes.io/projected/e142d0de-4827-4c18-a0da-8d75a43bb5a1-kube-api-access-8nmtx\") pod \"openshift-state-metrics-9d44df66c-b8xbx\" (UID: \"e142d0de-4827-4c18-a0da-8d75a43bb5a1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"
Apr 20 20:13:46.975267 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.975220 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-sys\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:46.975389 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.975265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e142d0de-4827-4c18-a0da-8d75a43bb5a1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b8xbx\" (UID: \"e142d0de-4827-4c18-a0da-8d75a43bb5a1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"
Apr 20 20:13:46.975389 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.975312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-textfile\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:46.975389 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.975346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-wtmp\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:46.975491 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.975393 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-root\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:46.975491 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.975426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-accelerators-collector-config\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:46.975491 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.975454 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qkdr\" (UniqueName: \"kubernetes.io/projected/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-kube-api-access-2qkdr\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:46.975491 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.975483 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-tls\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:46.975776 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.975505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e142d0de-4827-4c18-a0da-8d75a43bb5a1-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b8xbx\" (UID: \"e142d0de-4827-4c18-a0da-8d75a43bb5a1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"
Apr 20 20:13:46.975776 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.975562 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-metrics-client-ca\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:46.975776 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:46.975612 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e142d0de-4827-4c18-a0da-8d75a43bb5a1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b8xbx\" (UID: \"e142d0de-4827-4c18-a0da-8d75a43bb5a1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"
Apr 20 20:13:47.076227 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-accelerators-collector-config\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.076227 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qkdr\" (UniqueName: \"kubernetes.io/projected/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-kube-api-access-2qkdr\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.076227 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-tls\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.076476 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e142d0de-4827-4c18-a0da-8d75a43bb5a1-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b8xbx\" (UID: \"e142d0de-4827-4c18-a0da-8d75a43bb5a1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"
Apr 20 20:13:47.076476 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-metrics-client-ca\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.076476 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e142d0de-4827-4c18-a0da-8d75a43bb5a1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b8xbx\" (UID: \"e142d0de-4827-4c18-a0da-8d75a43bb5a1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"
Apr 20 20:13:47.076476 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.076476 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076382 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nmtx\" (UniqueName: \"kubernetes.io/projected/e142d0de-4827-4c18-a0da-8d75a43bb5a1-kube-api-access-8nmtx\") pod \"openshift-state-metrics-9d44df66c-b8xbx\" (UID: \"e142d0de-4827-4c18-a0da-8d75a43bb5a1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"
Apr 20 20:13:47.076476 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-sys\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.076476 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076454 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e142d0de-4827-4c18-a0da-8d75a43bb5a1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b8xbx\" (UID: \"e142d0de-4827-4c18-a0da-8d75a43bb5a1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"
Apr 20 20:13:47.076831 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-textfile\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.076831 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-wtmp\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.076831 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-root\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.076831 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076639 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-root\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.077027 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.076963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-sys\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.077099 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.077072 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-metrics-client-ca\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.077219 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.077113 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-accelerators-collector-config\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.077299 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.077270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-wtmp\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.077395 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.077357 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-textfile\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.077478 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.077444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e142d0de-4827-4c18-a0da-8d75a43bb5a1-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b8xbx\" (UID: \"e142d0de-4827-4c18-a0da-8d75a43bb5a1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"
Apr 20 20:13:47.080138 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.080113 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e142d0de-4827-4c18-a0da-8d75a43bb5a1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b8xbx\" (UID: \"e142d0de-4827-4c18-a0da-8d75a43bb5a1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"
Apr 20 20:13:47.080248 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.080118 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.080248 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.080202 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-node-exporter-tls\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.080897 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.080870 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e142d0de-4827-4c18-a0da-8d75a43bb5a1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b8xbx\" (UID: \"e142d0de-4827-4c18-a0da-8d75a43bb5a1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"
Apr 20 20:13:47.086817 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.086795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qkdr\" (UniqueName: \"kubernetes.io/projected/2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9-kube-api-access-2qkdr\") pod \"node-exporter-tc7w5\" (UID: \"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9\") " pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.086817 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.086806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nmtx\" (UniqueName: \"kubernetes.io/projected/e142d0de-4827-4c18-a0da-8d75a43bb5a1-kube-api-access-8nmtx\") pod \"openshift-state-metrics-9d44df66c-b8xbx\" (UID: \"e142d0de-4827-4c18-a0da-8d75a43bb5a1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"
Apr 20 20:13:47.137781 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.137756 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"
Apr 20 20:13:47.151676 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.151647 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tc7w5"
Apr 20 20:13:47.161251 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:47.161222 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a9255d8_9ff4_4cf9_840c_1d8a19b15ed9.slice/crio-e8871c22eebe51dff51092a8009469bbc2f5bb1970d7d6cc8b1163de27043a80 WatchSource:0}: Error finding container e8871c22eebe51dff51092a8009469bbc2f5bb1970d7d6cc8b1163de27043a80: Status 404 returned error can't find the container with id e8871c22eebe51dff51092a8009469bbc2f5bb1970d7d6cc8b1163de27043a80
Apr 20 20:13:47.267532 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.267510 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx"]
Apr 20 20:13:47.270648 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:47.270618 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode142d0de_4827_4c18_a0da_8d75a43bb5a1.slice/crio-df41dfb3a359e949f686ea224270f7123e046dc9ba3cbd0b1fedcd5992c69b1c WatchSource:0}: Error finding container df41dfb3a359e949f686ea224270f7123e046dc9ba3cbd0b1fedcd5992c69b1c: Status 404 returned error can't find the container with id df41dfb3a359e949f686ea224270f7123e046dc9ba3cbd0b1fedcd5992c69b1c
Apr 20 20:13:47.287221 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.287189 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx" event={"ID":"e142d0de-4827-4c18-a0da-8d75a43bb5a1","Type":"ContainerStarted","Data":"df41dfb3a359e949f686ea224270f7123e046dc9ba3cbd0b1fedcd5992c69b1c"}
Apr 20 20:13:47.288342 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:47.288316 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tc7w5" event={"ID":"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9","Type":"ContainerStarted","Data":"e8871c22eebe51dff51092a8009469bbc2f5bb1970d7d6cc8b1163de27043a80"}
Apr 20 20:13:48.293815 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:48.293565 2576 generic.go:358] "Generic (PLEG): container finished" podID="2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9" containerID="d21f54a0e32e20cb592339524edce79eb149d821cd0b5a0d7c8b9625188598f8" exitCode=0
Apr 20 20:13:48.293815 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:48.293640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tc7w5" event={"ID":"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9","Type":"ContainerDied","Data":"d21f54a0e32e20cb592339524edce79eb149d821cd0b5a0d7c8b9625188598f8"}
Apr 20 20:13:48.296129 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:48.296101 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx" event={"ID":"e142d0de-4827-4c18-a0da-8d75a43bb5a1","Type":"ContainerStarted","Data":"9b10055ebe14f89cc9fe0f8acba3c5f73437052cdb7f305721eed51e6e3c4ea0"}
Apr 20 20:13:48.296226 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:48.296137 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx" event={"ID":"e142d0de-4827-4c18-a0da-8d75a43bb5a1","Type":"ContainerStarted","Data":"bab37176295b3e5ae0492d6fddc1e20ca7a801fc675a07bc318a301de01d6da4"}
Apr 20 20:13:56.321148 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:56.321101 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lqqbc" event={"ID":"2b6fdc39-f579-4589-9eb1-5edb0e8bf56b","Type":"ContainerStarted","Data":"3a45b09329f74b96a965fde9e7cafb4bfa45939004625b0cc95bcad15acec978"}
Apr 20 20:13:56.321814 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:56.321788 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-lqqbc"
Apr 20 20:13:56.323596 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:56.323441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tc7w5" event={"ID":"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9","Type":"ContainerStarted","Data":"81aa298533e0bd400fcf74a798b960e3b2d5f7bc07e2a8b244600dd0511bec37"}
Apr 20 20:13:56.323596 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:56.323477 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tc7w5" event={"ID":"2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9","Type":"ContainerStarted","Data":"f34ac05ab04fb30038bb5cb718d38baa45b4a9d255c60e3db46d552b6d5d5cef"}
Apr 20 20:13:56.325692 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:56.325647 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx" event={"ID":"e142d0de-4827-4c18-a0da-8d75a43bb5a1","Type":"ContainerStarted","Data":"272cb0c6943e8252a94dccb927733f6e694a5e83e27b5bf9c5d801165d7404aa"}
Apr 20 20:13:56.340595 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:56.340572 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-lqqbc"
Apr 20 20:13:56.346136 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:56.346091 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-lqqbc" podStartSLOduration=1.252309774 podStartE2EDuration="18.34607547s" podCreationTimestamp="2026-04-20 20:13:38 +0000 UTC" firstStartedPulling="2026-04-20 20:13:38.543470016 +0000 UTC m=+134.413248627" lastFinishedPulling="2026-04-20 20:13:55.637235714 +0000 UTC m=+151.507014323" observedRunningTime="2026-04-20 20:13:56.345486661 +0000 UTC m=+152.215265292" watchObservedRunningTime="2026-04-20 20:13:56.34607547 +0000 UTC m=+152.215854102"
Apr 20 20:13:56.399597 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:56.399541 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b8xbx" podStartSLOduration=2.203516273 podStartE2EDuration="10.39952262s" podCreationTimestamp="2026-04-20 20:13:46 +0000 UTC" firstStartedPulling="2026-04-20 20:13:47.40311003 +0000 UTC m=+143.272888640" lastFinishedPulling="2026-04-20 20:13:55.599116378 +0000 UTC m=+151.468894987" observedRunningTime="2026-04-20 20:13:56.398333279 +0000 UTC m=+152.268111911" watchObservedRunningTime="2026-04-20 20:13:56.39952262 +0000 UTC m=+152.269301252"
Apr 20 20:13:56.417124 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:56.417073 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tc7w5" podStartSLOduration=9.617843399 podStartE2EDuration="10.417060101s" podCreationTimestamp="2026-04-20 20:13:46 +0000 UTC" firstStartedPulling="2026-04-20 20:13:47.163293716 +0000 UTC m=+143.033072326" lastFinishedPulling="2026-04-20 20:13:47.962510395 +0000 UTC m=+143.832289028" observedRunningTime="2026-04-20 20:13:56.415283479 +0000 UTC m=+152.285062112" watchObservedRunningTime="2026-04-20 20:13:56.417060101 +0000 UTC m=+152.286838731"
Apr 20 20:13:57.046586 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.046544 2576 kubelet.go:2537]
"SyncLoop ADD" source="api" pods=["openshift-console/console-fc7d876c-glrb4"] Apr 20 20:13:57.051646 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.051618 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.054263 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.054232 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 20:13:57.054388 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.054238 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 20:13:57.054388 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.054245 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 20:13:57.055863 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.055842 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 20:13:57.056296 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.056276 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-m57vs\"" Apr 20 20:13:57.056391 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.056328 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 20:13:57.061647 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.061618 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fc7d876c-glrb4"] Apr 20 20:13:57.065520 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.065494 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-oauth-serving-cert\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.066013 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.065994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-trusted-ca-bundle\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.066136 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.066124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-oauth-config\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.066255 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.066243 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-config\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.066337 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.066325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-serving-cert\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.066434 ip-10-0-129-247 kubenswrapper[2576]: I0420 
20:13:57.066423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98l9t\" (UniqueName: \"kubernetes.io/projected/2f75af20-20e1-4945-8cc8-73adfabcdfa3-kube-api-access-98l9t\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.066515 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.066504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-service-ca\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.072288 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.071993 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 20:13:57.167353 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.167311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-trusted-ca-bundle\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.167533 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.167373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-oauth-config\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.167533 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.167418 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-config\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.167533 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.167443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-serving-cert\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.167533 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.167483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98l9t\" (UniqueName: \"kubernetes.io/projected/2f75af20-20e1-4945-8cc8-73adfabcdfa3-kube-api-access-98l9t\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.167533 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.167511 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-service-ca\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.167820 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.167570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-oauth-serving-cert\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.168222 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.168192 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-config\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.168342 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.168265 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-trusted-ca-bundle\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.168484 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.168461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-oauth-serving-cert\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.170462 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.170436 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-oauth-config\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.170566 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.170472 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-serving-cert\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.170566 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.170498 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-service-ca\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.175874 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.175851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98l9t\" (UniqueName: \"kubernetes.io/projected/2f75af20-20e1-4945-8cc8-73adfabcdfa3-kube-api-access-98l9t\") pod \"console-fc7d876c-glrb4\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") " pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.371367 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.371288 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fc7d876c-glrb4" Apr 20 20:13:57.517568 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.517541 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fc7d876c-glrb4"] Apr 20 20:13:57.520683 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:57.520649 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f75af20_20e1_4945_8cc8_73adfabcdfa3.slice/crio-40ccc8d74395891f535281ec3e3b8c788c316398788c39533e563c8aa856e2b3 WatchSource:0}: Error finding container 40ccc8d74395891f535281ec3e3b8c788c316398788c39533e563c8aa856e2b3: Status 404 returned error can't find the container with id 40ccc8d74395891f535281ec3e3b8c788c316398788c39533e563c8aa856e2b3 Apr 20 20:13:57.648964 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.648882 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c584cd6d7-wbfbq"] Apr 20 20:13:57.685044 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.685017 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-5c584cd6d7-wbfbq"] Apr 20 20:13:57.685207 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.685131 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.773233 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.773197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-service-ca\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.773380 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.773253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30583cad-9ccc-486f-8d3c-0975939e3264-console-serving-cert\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.773380 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.773272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-oauth-serving-cert\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.773380 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.773299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-trusted-ca-bundle\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 
20:13:57.773380 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.773338 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjrcn\" (UniqueName: \"kubernetes.io/projected/30583cad-9ccc-486f-8d3c-0975939e3264-kube-api-access-mjrcn\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.773380 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.773374 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-console-config\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.773635 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.773403 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30583cad-9ccc-486f-8d3c-0975939e3264-console-oauth-config\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.874692 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.874657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30583cad-9ccc-486f-8d3c-0975939e3264-console-oauth-config\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.874883 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.874788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-service-ca\") pod 
\"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.874883 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.874841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30583cad-9ccc-486f-8d3c-0975939e3264-console-serving-cert\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.874883 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.874864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-oauth-serving-cert\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.875012 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.874907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-trusted-ca-bundle\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.875012 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.874993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjrcn\" (UniqueName: \"kubernetes.io/projected/30583cad-9ccc-486f-8d3c-0975939e3264-kube-api-access-mjrcn\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.875123 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.875068 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-console-config\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.875679 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.875653 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-service-ca\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.875833 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.875717 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-console-config\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.875910 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.875831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-oauth-serving-cert\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.876407 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.876379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-trusted-ca-bundle\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.877692 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.877668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/30583cad-9ccc-486f-8d3c-0975939e3264-console-oauth-config\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.877810 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.877760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30583cad-9ccc-486f-8d3c-0975939e3264-console-serving-cert\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.883090 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.883069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjrcn\" (UniqueName: \"kubernetes.io/projected/30583cad-9ccc-486f-8d3c-0975939e3264-kube-api-access-mjrcn\") pod \"console-5c584cd6d7-wbfbq\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") " pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:57.997309 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:57.997230 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:13:58.162310 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:58.162273 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c584cd6d7-wbfbq"] Apr 20 20:13:58.168321 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:13:58.168291 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30583cad_9ccc_486f_8d3c_0975939e3264.slice/crio-29d575de39054f8c4eb3bd47769e1f3659cf4755e04b68ebabb027a69ee1b80d WatchSource:0}: Error finding container 29d575de39054f8c4eb3bd47769e1f3659cf4755e04b68ebabb027a69ee1b80d: Status 404 returned error can't find the container with id 29d575de39054f8c4eb3bd47769e1f3659cf4755e04b68ebabb027a69ee1b80d Apr 20 20:13:58.336553 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:58.336497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fc7d876c-glrb4" event={"ID":"2f75af20-20e1-4945-8cc8-73adfabcdfa3","Type":"ContainerStarted","Data":"40ccc8d74395891f535281ec3e3b8c788c316398788c39533e563c8aa856e2b3"} Apr 20 20:13:58.338088 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:13:58.338040 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c584cd6d7-wbfbq" event={"ID":"30583cad-9ccc-486f-8d3c-0975939e3264","Type":"ContainerStarted","Data":"29d575de39054f8c4eb3bd47769e1f3659cf4755e04b68ebabb027a69ee1b80d"} Apr 20 20:14:00.502015 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:14:00.501926 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-gr79r" podUID="75a04807-4a1d-4a9f-9f26-b677e822247a" Apr 20 20:14:00.510634 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:14:00.510595 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-5c5mw" podUID="ae28c5cd-450f-42d0-a36c-1e045e920a41" Apr 20 20:14:01.350670 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:01.350631 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gr79r" Apr 20 20:14:01.350883 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:01.350631 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5c5mw" Apr 20 20:14:02.355930 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:02.355890 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fc7d876c-glrb4" event={"ID":"2f75af20-20e1-4945-8cc8-73adfabcdfa3","Type":"ContainerStarted","Data":"03498852b1faeda84679b2913579ee06f8122cd0dedbe59dca8f38efc7f9abe0"} Apr 20 20:14:02.357549 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:02.357519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c584cd6d7-wbfbq" event={"ID":"30583cad-9ccc-486f-8d3c-0975939e3264","Type":"ContainerStarted","Data":"963e63c669fe6e3db1772d5b4967ef1199091b327f443958728cc432d492a7d6"} Apr 20 20:14:02.373714 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:02.373656 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-fc7d876c-glrb4" podStartSLOduration=1.2841757010000001 podStartE2EDuration="5.373644771s" podCreationTimestamp="2026-04-20 20:13:57 +0000 UTC" firstStartedPulling="2026-04-20 20:13:57.522803679 +0000 UTC m=+153.392582288" lastFinishedPulling="2026-04-20 20:14:01.612272748 +0000 UTC m=+157.482051358" observedRunningTime="2026-04-20 20:14:02.37188292 +0000 UTC m=+158.241661556" watchObservedRunningTime="2026-04-20 20:14:02.373644771 +0000 UTC m=+158.243423402" Apr 20 20:14:02.388598 ip-10-0-129-247 kubenswrapper[2576]: I0420 
20:14:02.388549 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c584cd6d7-wbfbq" podStartSLOduration=1.853664926 podStartE2EDuration="5.388537392s" podCreationTimestamp="2026-04-20 20:13:57 +0000 UTC" firstStartedPulling="2026-04-20 20:13:58.17127231 +0000 UTC m=+154.041050933" lastFinishedPulling="2026-04-20 20:14:01.706144788 +0000 UTC m=+157.575923399" observedRunningTime="2026-04-20 20:14:02.386931347 +0000 UTC m=+158.256709979" watchObservedRunningTime="2026-04-20 20:14:02.388537392 +0000 UTC m=+158.258316024"
Apr 20 20:14:05.349118 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:05.349078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert\") pod \"ingress-canary-5c5mw\" (UID: \"ae28c5cd-450f-42d0-a36c-1e045e920a41\") " pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:14:05.349621 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:05.349130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:14:05.351972 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:05.351936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75a04807-4a1d-4a9f-9f26-b677e822247a-metrics-tls\") pod \"dns-default-gr79r\" (UID: \"75a04807-4a1d-4a9f-9f26-b677e822247a\") " pod="openshift-dns/dns-default-gr79r"
Apr 20 20:14:05.352098 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:05.351980 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae28c5cd-450f-42d0-a36c-1e045e920a41-cert\") pod \"ingress-canary-5c5mw\" (UID: \"ae28c5cd-450f-42d0-a36c-1e045e920a41\") " pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:14:05.554307 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:05.554281 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bkrnz\""
Apr 20 20:14:05.555333 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:05.555309 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rfvs6\""
Apr 20 20:14:05.562409 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:05.562386 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5c5mw"
Apr 20 20:14:05.562510 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:05.562421 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gr79r"
Apr 20 20:14:05.716493 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:05.716391 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gr79r"]
Apr 20 20:14:05.735635 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:05.735608 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5c5mw"]
Apr 20 20:14:05.738937 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:14:05.738906 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae28c5cd_450f_42d0_a36c_1e045e920a41.slice/crio-73fa8f495b4a4a67db8e7e42b10f248ffec794ba659c6a67b4dd9568fca75f5a WatchSource:0}: Error finding container 73fa8f495b4a4a67db8e7e42b10f248ffec794ba659c6a67b4dd9568fca75f5a: Status 404 returned error can't find the container with id 73fa8f495b4a4a67db8e7e42b10f248ffec794ba659c6a67b4dd9568fca75f5a
Apr 20 20:14:06.371395 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:06.371358 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gr79r" event={"ID":"75a04807-4a1d-4a9f-9f26-b677e822247a","Type":"ContainerStarted","Data":"6d4cedd44e01cb34369db4f4e18656dff3f6292bec6b402e2df810197056d10d"}
Apr 20 20:14:06.372680 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:06.372645 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5c5mw" event={"ID":"ae28c5cd-450f-42d0-a36c-1e045e920a41","Type":"ContainerStarted","Data":"73fa8f495b4a4a67db8e7e42b10f248ffec794ba659c6a67b4dd9568fca75f5a"}
Apr 20 20:14:07.371956 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:07.371874 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-fc7d876c-glrb4"
Apr 20 20:14:07.372306 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:07.371953 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-fc7d876c-glrb4"
Apr 20 20:14:07.376758 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:07.376714 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-fc7d876c-glrb4"
Apr 20 20:14:07.997530 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:07.997493 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c584cd6d7-wbfbq"
Apr 20 20:14:07.997530 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:07.997531 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c584cd6d7-wbfbq"
Apr 20 20:14:08.002349 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:08.002330 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c584cd6d7-wbfbq"
Apr 20 20:14:08.389856 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:08.389832 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-fc7d876c-glrb4"
Apr 20 20:14:08.390219 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:08.390008 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c584cd6d7-wbfbq"
Apr 20 20:14:08.455125 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:08.453013 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-fc7d876c-glrb4"]
Apr 20 20:14:09.382050 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:09.382007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gr79r" event={"ID":"75a04807-4a1d-4a9f-9f26-b677e822247a","Type":"ContainerStarted","Data":"30396df7daca0e3bdc41be605d4c2c4629ac107a04d305158407a62c3a192309"}
Apr 20 20:14:09.382050 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:09.382050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gr79r" event={"ID":"75a04807-4a1d-4a9f-9f26-b677e822247a","Type":"ContainerStarted","Data":"9732a449bbf1831ee8260e35aa7b23f7192d417f2635f3eb3db931ae39f8c0bc"}
Apr 20 20:14:09.382281 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:09.382091 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-gr79r"
Apr 20 20:14:09.383391 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:09.383365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5c5mw" event={"ID":"ae28c5cd-450f-42d0-a36c-1e045e920a41","Type":"ContainerStarted","Data":"3a3a46dd9b99e3350ba40e61318583f50768a345b52c2a6df36698e9757edec2"}
Apr 20 20:14:09.402227 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:09.402192 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gr79r" podStartSLOduration=129.767255421 podStartE2EDuration="2m12.402181871s" podCreationTimestamp="2026-04-20 20:11:57 +0000 UTC" firstStartedPulling="2026-04-20 20:14:05.721976183 +0000 UTC m=+161.591754792" lastFinishedPulling="2026-04-20 20:14:08.356902624 +0000 UTC m=+164.226681242" observedRunningTime="2026-04-20 20:14:09.400867971 +0000 UTC m=+165.270646618" watchObservedRunningTime="2026-04-20 20:14:09.402181871 +0000 UTC m=+165.271960496"
Apr 20 20:14:09.418470 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:09.418430 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5c5mw" podStartSLOduration=129.798202506 podStartE2EDuration="2m12.418419822s" podCreationTimestamp="2026-04-20 20:11:57 +0000 UTC" firstStartedPulling="2026-04-20 20:14:05.740879191 +0000 UTC m=+161.610657814" lastFinishedPulling="2026-04-20 20:14:08.36109652 +0000 UTC m=+164.230875130" observedRunningTime="2026-04-20 20:14:09.416694711 +0000 UTC m=+165.286473355" watchObservedRunningTime="2026-04-20 20:14:09.418419822 +0000 UTC m=+165.288198452"
Apr 20 20:14:19.388535 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:19.388505 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gr79r"
Apr 20 20:14:22.422947 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:22.422913 2576 generic.go:358] "Generic (PLEG): container finished" podID="cf54aa0b-6d6e-447c-9260-19fd40703cdc" containerID="83c657f4d77eea1cdfa0b40984b6f8bfda3b0ba304886ac4d95a93b58402ae67" exitCode=0
Apr 20 20:14:22.423355 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:22.422991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf" event={"ID":"cf54aa0b-6d6e-447c-9260-19fd40703cdc","Type":"ContainerDied","Data":"83c657f4d77eea1cdfa0b40984b6f8bfda3b0ba304886ac4d95a93b58402ae67"}
Apr 20 20:14:22.423355 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:22.423288 2576 scope.go:117] "RemoveContainer" containerID="83c657f4d77eea1cdfa0b40984b6f8bfda3b0ba304886ac4d95a93b58402ae67"
Apr 20 20:14:23.427781 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:23.427723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s8wdf" event={"ID":"cf54aa0b-6d6e-447c-9260-19fd40703cdc","Type":"ContainerStarted","Data":"b8126fb4e452dc94ec7e953b671e23b43da3a5325dfdd020cb2ac8af01fc3112"}
Apr 20 20:14:27.443203 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:27.443112 2576 generic.go:358] "Generic (PLEG): container finished" podID="238f5c92-7bf3-4286-9225-285c572921b5" containerID="d5e9bb176c3322a267012bb3f19d350311b322378729464ca8a435c577a2d19f" exitCode=0
Apr 20 20:14:27.443656 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:27.443192 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58" event={"ID":"238f5c92-7bf3-4286-9225-285c572921b5","Type":"ContainerDied","Data":"d5e9bb176c3322a267012bb3f19d350311b322378729464ca8a435c577a2d19f"}
Apr 20 20:14:27.443656 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:27.443585 2576 scope.go:117] "RemoveContainer" containerID="d5e9bb176c3322a267012bb3f19d350311b322378729464ca8a435c577a2d19f"
Apr 20 20:14:28.447559 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:28.447519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xd58" event={"ID":"238f5c92-7bf3-4286-9225-285c572921b5","Type":"ContainerStarted","Data":"3e2d6dca783b002e9cdbac877a24649896be97cf2ebb68925e55b333d3defdd7"}
Apr 20 20:14:35.405836 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.405775 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-fc7d876c-glrb4" podUID="2f75af20-20e1-4945-8cc8-73adfabcdfa3" containerName="console" containerID="cri-o://03498852b1faeda84679b2913579ee06f8122cd0dedbe59dca8f38efc7f9abe0" gracePeriod=15
Apr 20 20:14:35.661269 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.661219 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fc7d876c-glrb4_2f75af20-20e1-4945-8cc8-73adfabcdfa3/console/0.log"
Apr 20 20:14:35.661358 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.661286 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fc7d876c-glrb4"
Apr 20 20:14:35.782429 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.782402 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-oauth-config\") pod \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") "
Apr 20 20:14:35.782561 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.782442 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-config\") pod \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") "
Apr 20 20:14:35.782561 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.782462 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98l9t\" (UniqueName: \"kubernetes.io/projected/2f75af20-20e1-4945-8cc8-73adfabcdfa3-kube-api-access-98l9t\") pod \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") "
Apr 20 20:14:35.782561 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.782511 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-serving-cert\") pod \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") "
Apr 20 20:14:35.782561 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.782527 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-trusted-ca-bundle\") pod \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") "
Apr 20 20:14:35.782561 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.782551 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-oauth-serving-cert\") pod \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") "
Apr 20 20:14:35.782846 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.782584 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-service-ca\") pod \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\" (UID: \"2f75af20-20e1-4945-8cc8-73adfabcdfa3\") "
Apr 20 20:14:35.783046 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.783015 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2f75af20-20e1-4945-8cc8-73adfabcdfa3" (UID: "2f75af20-20e1-4945-8cc8-73adfabcdfa3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:14:35.783046 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.783025 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2f75af20-20e1-4945-8cc8-73adfabcdfa3" (UID: "2f75af20-20e1-4945-8cc8-73adfabcdfa3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:14:35.783174 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.783103 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-service-ca" (OuterVolumeSpecName: "service-ca") pod "2f75af20-20e1-4945-8cc8-73adfabcdfa3" (UID: "2f75af20-20e1-4945-8cc8-73adfabcdfa3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:14:35.783174 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.783132 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-config" (OuterVolumeSpecName: "console-config") pod "2f75af20-20e1-4945-8cc8-73adfabcdfa3" (UID: "2f75af20-20e1-4945-8cc8-73adfabcdfa3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:14:35.784878 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.784850 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2f75af20-20e1-4945-8cc8-73adfabcdfa3" (UID: "2f75af20-20e1-4945-8cc8-73adfabcdfa3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:14:35.784993 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.784900 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2f75af20-20e1-4945-8cc8-73adfabcdfa3" (UID: "2f75af20-20e1-4945-8cc8-73adfabcdfa3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:14:35.784993 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.784948 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f75af20-20e1-4945-8cc8-73adfabcdfa3-kube-api-access-98l9t" (OuterVolumeSpecName: "kube-api-access-98l9t") pod "2f75af20-20e1-4945-8cc8-73adfabcdfa3" (UID: "2f75af20-20e1-4945-8cc8-73adfabcdfa3"). InnerVolumeSpecName "kube-api-access-98l9t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:14:35.883709 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.883688 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-serving-cert\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:14:35.883709 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.883709 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-trusted-ca-bundle\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:14:35.883916 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.883719 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-oauth-serving-cert\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:14:35.883916 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.883728 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-service-ca\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:14:35.883916 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.883757 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-oauth-config\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:14:35.883916 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.883766 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f75af20-20e1-4945-8cc8-73adfabcdfa3-console-config\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:14:35.883916 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:35.883775 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-98l9t\" (UniqueName: \"kubernetes.io/projected/2f75af20-20e1-4945-8cc8-73adfabcdfa3-kube-api-access-98l9t\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:14:36.470528 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:36.470503 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fc7d876c-glrb4_2f75af20-20e1-4945-8cc8-73adfabcdfa3/console/0.log"
Apr 20 20:14:36.470927 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:36.470541 2576 generic.go:358] "Generic (PLEG): container finished" podID="2f75af20-20e1-4945-8cc8-73adfabcdfa3" containerID="03498852b1faeda84679b2913579ee06f8122cd0dedbe59dca8f38efc7f9abe0" exitCode=2
Apr 20 20:14:36.470927 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:36.470573 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fc7d876c-glrb4" event={"ID":"2f75af20-20e1-4945-8cc8-73adfabcdfa3","Type":"ContainerDied","Data":"03498852b1faeda84679b2913579ee06f8122cd0dedbe59dca8f38efc7f9abe0"}
Apr 20 20:14:36.470927 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:36.470600 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fc7d876c-glrb4"
Apr 20 20:14:36.470927 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:36.470613 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fc7d876c-glrb4" event={"ID":"2f75af20-20e1-4945-8cc8-73adfabcdfa3","Type":"ContainerDied","Data":"40ccc8d74395891f535281ec3e3b8c788c316398788c39533e563c8aa856e2b3"}
Apr 20 20:14:36.470927 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:36.470630 2576 scope.go:117] "RemoveContainer" containerID="03498852b1faeda84679b2913579ee06f8122cd0dedbe59dca8f38efc7f9abe0"
Apr 20 20:14:36.479299 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:36.479281 2576 scope.go:117] "RemoveContainer" containerID="03498852b1faeda84679b2913579ee06f8122cd0dedbe59dca8f38efc7f9abe0"
Apr 20 20:14:36.479544 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:14:36.479521 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03498852b1faeda84679b2913579ee06f8122cd0dedbe59dca8f38efc7f9abe0\": container with ID starting with 03498852b1faeda84679b2913579ee06f8122cd0dedbe59dca8f38efc7f9abe0 not found: ID does not exist" containerID="03498852b1faeda84679b2913579ee06f8122cd0dedbe59dca8f38efc7f9abe0"
Apr 20 20:14:36.479607 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:36.479550 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03498852b1faeda84679b2913579ee06f8122cd0dedbe59dca8f38efc7f9abe0"} err="failed to get container status \"03498852b1faeda84679b2913579ee06f8122cd0dedbe59dca8f38efc7f9abe0\": rpc error: code = NotFound desc = could not find container \"03498852b1faeda84679b2913579ee06f8122cd0dedbe59dca8f38efc7f9abe0\": container with ID starting with 03498852b1faeda84679b2913579ee06f8122cd0dedbe59dca8f38efc7f9abe0 not found: ID does not exist"
Apr 20 20:14:36.490073 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:36.490051 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-fc7d876c-glrb4"]
Apr 20 20:14:36.493834 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:36.493813 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-fc7d876c-glrb4"]
Apr 20 20:14:36.784874 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:14:36.784845 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f75af20-20e1-4945-8cc8-73adfabcdfa3" path="/var/lib/kubelet/pods/2f75af20-20e1-4945-8cc8-73adfabcdfa3/volumes"
Apr 20 20:15:14.655376 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:14.655342 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c584cd6d7-wbfbq"]
Apr 20 20:15:36.036329 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.036300 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wjlsd"]
Apr 20 20:15:36.036808 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.036640 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f75af20-20e1-4945-8cc8-73adfabcdfa3" containerName="console"
Apr 20 20:15:36.036808 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.036652 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f75af20-20e1-4945-8cc8-73adfabcdfa3" containerName="console"
Apr 20 20:15:36.036808 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.036716 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f75af20-20e1-4945-8cc8-73adfabcdfa3" containerName="console"
Apr 20 20:15:36.039453 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.039438 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wjlsd"
Apr 20 20:15:36.041785 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.041765 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 20:15:36.046231 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.046082 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wjlsd"]
Apr 20 20:15:36.096299 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.096276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a126d196-9791-4180-8ef6-3408f36fa528-kubelet-config\") pod \"global-pull-secret-syncer-wjlsd\" (UID: \"a126d196-9791-4180-8ef6-3408f36fa528\") " pod="kube-system/global-pull-secret-syncer-wjlsd"
Apr 20 20:15:36.096414 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.096326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a126d196-9791-4180-8ef6-3408f36fa528-original-pull-secret\") pod \"global-pull-secret-syncer-wjlsd\" (UID: \"a126d196-9791-4180-8ef6-3408f36fa528\") " pod="kube-system/global-pull-secret-syncer-wjlsd"
Apr 20 20:15:36.096414 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.096354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a126d196-9791-4180-8ef6-3408f36fa528-dbus\") pod \"global-pull-secret-syncer-wjlsd\" (UID: \"a126d196-9791-4180-8ef6-3408f36fa528\") " pod="kube-system/global-pull-secret-syncer-wjlsd"
Apr 20 20:15:36.196940 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.196900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a126d196-9791-4180-8ef6-3408f36fa528-original-pull-secret\") pod \"global-pull-secret-syncer-wjlsd\" (UID: \"a126d196-9791-4180-8ef6-3408f36fa528\") " pod="kube-system/global-pull-secret-syncer-wjlsd"
Apr 20 20:15:36.197071 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.196960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a126d196-9791-4180-8ef6-3408f36fa528-dbus\") pod \"global-pull-secret-syncer-wjlsd\" (UID: \"a126d196-9791-4180-8ef6-3408f36fa528\") " pod="kube-system/global-pull-secret-syncer-wjlsd"
Apr 20 20:15:36.197071 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.196993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a126d196-9791-4180-8ef6-3408f36fa528-kubelet-config\") pod \"global-pull-secret-syncer-wjlsd\" (UID: \"a126d196-9791-4180-8ef6-3408f36fa528\") " pod="kube-system/global-pull-secret-syncer-wjlsd"
Apr 20 20:15:36.197144 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.197069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a126d196-9791-4180-8ef6-3408f36fa528-kubelet-config\") pod \"global-pull-secret-syncer-wjlsd\" (UID: \"a126d196-9791-4180-8ef6-3408f36fa528\") " pod="kube-system/global-pull-secret-syncer-wjlsd"
Apr 20 20:15:36.197177 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.197160 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a126d196-9791-4180-8ef6-3408f36fa528-dbus\") pod \"global-pull-secret-syncer-wjlsd\" (UID: \"a126d196-9791-4180-8ef6-3408f36fa528\") " pod="kube-system/global-pull-secret-syncer-wjlsd"
Apr 20 20:15:36.199368 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.199349 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a126d196-9791-4180-8ef6-3408f36fa528-original-pull-secret\") pod \"global-pull-secret-syncer-wjlsd\" (UID: \"a126d196-9791-4180-8ef6-3408f36fa528\") " pod="kube-system/global-pull-secret-syncer-wjlsd"
Apr 20 20:15:36.348515 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.348447 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wjlsd"
Apr 20 20:15:36.465831 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.465802 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wjlsd"]
Apr 20 20:15:36.468565 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:15:36.468539 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda126d196_9791_4180_8ef6_3408f36fa528.slice/crio-35c93c8c75c533644f561b0d647896e3d7a5c56d9a0aed548569cec453ab246f WatchSource:0}: Error finding container 35c93c8c75c533644f561b0d647896e3d7a5c56d9a0aed548569cec453ab246f: Status 404 returned error can't find the container with id 35c93c8c75c533644f561b0d647896e3d7a5c56d9a0aed548569cec453ab246f
Apr 20 20:15:36.637943 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:36.637881 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wjlsd" event={"ID":"a126d196-9791-4180-8ef6-3408f36fa528","Type":"ContainerStarted","Data":"35c93c8c75c533644f561b0d647896e3d7a5c56d9a0aed548569cec453ab246f"}
Apr 20 20:15:39.675312 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:39.675265 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c584cd6d7-wbfbq" podUID="30583cad-9ccc-486f-8d3c-0975939e3264" containerName="console" containerID="cri-o://963e63c669fe6e3db1772d5b4967ef1199091b327f443958728cc432d492a7d6" gracePeriod=15
Apr 20 20:15:41.083368 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.083345 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c584cd6d7-wbfbq_30583cad-9ccc-486f-8d3c-0975939e3264/console/0.log"
Apr 20 20:15:41.083700 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.083422 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c584cd6d7-wbfbq"
Apr 20 20:15:41.138637 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.138607 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjrcn\" (UniqueName: \"kubernetes.io/projected/30583cad-9ccc-486f-8d3c-0975939e3264-kube-api-access-mjrcn\") pod \"30583cad-9ccc-486f-8d3c-0975939e3264\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") "
Apr 20 20:15:41.138804 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.138660 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30583cad-9ccc-486f-8d3c-0975939e3264-console-serving-cert\") pod \"30583cad-9ccc-486f-8d3c-0975939e3264\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") "
Apr 20 20:15:41.138804 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.138704 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-console-config\") pod \"30583cad-9ccc-486f-8d3c-0975939e3264\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") "
Apr 20 20:15:41.138804 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.138748 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-service-ca\") pod \"30583cad-9ccc-486f-8d3c-0975939e3264\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") "
Apr 20 20:15:41.138804 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.138773 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30583cad-9ccc-486f-8d3c-0975939e3264-console-oauth-config\") pod \"30583cad-9ccc-486f-8d3c-0975939e3264\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") "
Apr 20 20:15:41.139118 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.139082 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-console-config" (OuterVolumeSpecName: "console-config") pod "30583cad-9ccc-486f-8d3c-0975939e3264" (UID: "30583cad-9ccc-486f-8d3c-0975939e3264"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:15:41.139173 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.139145 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-trusted-ca-bundle\") pod \"30583cad-9ccc-486f-8d3c-0975939e3264\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") "
Apr 20 20:15:41.139224 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.139184 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-oauth-serving-cert\") pod \"30583cad-9ccc-486f-8d3c-0975939e3264\" (UID: \"30583cad-9ccc-486f-8d3c-0975939e3264\") "
Apr 20 20:15:41.139274 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.139226 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-service-ca" (OuterVolumeSpecName: "service-ca") pod "30583cad-9ccc-486f-8d3c-0975939e3264" (UID: "30583cad-9ccc-486f-8d3c-0975939e3264"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:15:41.139626 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.139523 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "30583cad-9ccc-486f-8d3c-0975939e3264" (UID: "30583cad-9ccc-486f-8d3c-0975939e3264"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:15:41.139626 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.139605 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-console-config\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:15:41.139626 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.139623 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-service-ca\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:15:41.139841 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.139664 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "30583cad-9ccc-486f-8d3c-0975939e3264" (UID: "30583cad-9ccc-486f-8d3c-0975939e3264"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:15:41.141208 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.141185 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30583cad-9ccc-486f-8d3c-0975939e3264-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "30583cad-9ccc-486f-8d3c-0975939e3264" (UID: "30583cad-9ccc-486f-8d3c-0975939e3264"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:15:41.141315 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.141291 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30583cad-9ccc-486f-8d3c-0975939e3264-kube-api-access-mjrcn" (OuterVolumeSpecName: "kube-api-access-mjrcn") pod "30583cad-9ccc-486f-8d3c-0975939e3264" (UID: "30583cad-9ccc-486f-8d3c-0975939e3264"). InnerVolumeSpecName "kube-api-access-mjrcn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:15:41.141571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.141549 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30583cad-9ccc-486f-8d3c-0975939e3264-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "30583cad-9ccc-486f-8d3c-0975939e3264" (UID: "30583cad-9ccc-486f-8d3c-0975939e3264"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:15:41.240230 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.240156 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30583cad-9ccc-486f-8d3c-0975939e3264-console-oauth-config\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:15:41.240230 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.240182 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-trusted-ca-bundle\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:15:41.240230 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.240193 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30583cad-9ccc-486f-8d3c-0975939e3264-oauth-serving-cert\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:15:41.240230 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.240202 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mjrcn\" (UniqueName: \"kubernetes.io/projected/30583cad-9ccc-486f-8d3c-0975939e3264-kube-api-access-mjrcn\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:15:41.240230 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.240210 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30583cad-9ccc-486f-8d3c-0975939e3264-console-serving-cert\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:15:41.654239 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.654215 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c584cd6d7-wbfbq_30583cad-9ccc-486f-8d3c-0975939e3264/console/0.log"
Apr 20 20:15:41.654384 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.654254 2576 generic.go:358] "Generic (PLEG): container finished" podID="30583cad-9ccc-486f-8d3c-0975939e3264" containerID="963e63c669fe6e3db1772d5b4967ef1199091b327f443958728cc432d492a7d6" exitCode=2
Apr 20 20:15:41.654384 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.654311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c584cd6d7-wbfbq" event={"ID":"30583cad-9ccc-486f-8d3c-0975939e3264","Type":"ContainerDied","Data":"963e63c669fe6e3db1772d5b4967ef1199091b327f443958728cc432d492a7d6"}
Apr 20 20:15:41.654384 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.654322 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c584cd6d7-wbfbq" Apr 20 20:15:41.654384 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.654338 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c584cd6d7-wbfbq" event={"ID":"30583cad-9ccc-486f-8d3c-0975939e3264","Type":"ContainerDied","Data":"29d575de39054f8c4eb3bd47769e1f3659cf4755e04b68ebabb027a69ee1b80d"} Apr 20 20:15:41.654384 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.654354 2576 scope.go:117] "RemoveContainer" containerID="963e63c669fe6e3db1772d5b4967ef1199091b327f443958728cc432d492a7d6" Apr 20 20:15:41.655972 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.655952 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wjlsd" event={"ID":"a126d196-9791-4180-8ef6-3408f36fa528","Type":"ContainerStarted","Data":"547aee70b91e917c915609a406fa02ac29f4fc075aae2356685dfca12ea67351"} Apr 20 20:15:41.662967 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.662945 2576 scope.go:117] "RemoveContainer" containerID="963e63c669fe6e3db1772d5b4967ef1199091b327f443958728cc432d492a7d6" Apr 20 20:15:41.663202 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:15:41.663181 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963e63c669fe6e3db1772d5b4967ef1199091b327f443958728cc432d492a7d6\": container with ID starting with 963e63c669fe6e3db1772d5b4967ef1199091b327f443958728cc432d492a7d6 not found: ID does not exist" containerID="963e63c669fe6e3db1772d5b4967ef1199091b327f443958728cc432d492a7d6" Apr 20 20:15:41.663291 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.663209 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963e63c669fe6e3db1772d5b4967ef1199091b327f443958728cc432d492a7d6"} err="failed to get container status \"963e63c669fe6e3db1772d5b4967ef1199091b327f443958728cc432d492a7d6\": rpc error: code 
= NotFound desc = could not find container \"963e63c669fe6e3db1772d5b4967ef1199091b327f443958728cc432d492a7d6\": container with ID starting with 963e63c669fe6e3db1772d5b4967ef1199091b327f443958728cc432d492a7d6 not found: ID does not exist" Apr 20 20:15:41.673969 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.673933 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wjlsd" podStartSLOduration=1.143485353 podStartE2EDuration="5.673921547s" podCreationTimestamp="2026-04-20 20:15:36 +0000 UTC" firstStartedPulling="2026-04-20 20:15:36.470336676 +0000 UTC m=+252.340115285" lastFinishedPulling="2026-04-20 20:15:41.000772862 +0000 UTC m=+256.870551479" observedRunningTime="2026-04-20 20:15:41.672696448 +0000 UTC m=+257.542475091" watchObservedRunningTime="2026-04-20 20:15:41.673921547 +0000 UTC m=+257.543700177" Apr 20 20:15:41.685494 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.685473 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c584cd6d7-wbfbq"] Apr 20 20:15:41.687803 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:41.687784 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c584cd6d7-wbfbq"] Apr 20 20:15:42.784541 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:15:42.784507 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30583cad-9ccc-486f-8d3c-0975939e3264" path="/var/lib/kubelet/pods/30583cad-9ccc-486f-8d3c-0975939e3264/volumes" Apr 20 20:16:07.057862 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.057829 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8"] Apr 20 20:16:07.058234 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.058115 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30583cad-9ccc-486f-8d3c-0975939e3264" containerName="console" Apr 20 
20:16:07.058234 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.058128 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="30583cad-9ccc-486f-8d3c-0975939e3264" containerName="console" Apr 20 20:16:07.058234 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.058192 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="30583cad-9ccc-486f-8d3c-0975939e3264" containerName="console" Apr 20 20:16:07.061653 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.061627 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" Apr 20 20:16:07.064728 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.064711 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 20:16:07.065763 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.065726 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x9jb9\"" Apr 20 20:16:07.065859 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.065753 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 20:16:07.072269 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.072241 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8"] Apr 20 20:16:07.112956 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.112927 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acff97c4-73fb-450b-a766-3774d64f10f0-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8\" (UID: \"acff97c4-73fb-450b-a766-3774d64f10f0\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" Apr 20 20:16:07.113146 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.112984 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acff97c4-73fb-450b-a766-3774d64f10f0-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8\" (UID: \"acff97c4-73fb-450b-a766-3774d64f10f0\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" Apr 20 20:16:07.113146 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.113050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbwsv\" (UniqueName: \"kubernetes.io/projected/acff97c4-73fb-450b-a766-3774d64f10f0-kube-api-access-tbwsv\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8\" (UID: \"acff97c4-73fb-450b-a766-3774d64f10f0\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" Apr 20 20:16:07.213480 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.213456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acff97c4-73fb-450b-a766-3774d64f10f0-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8\" (UID: \"acff97c4-73fb-450b-a766-3774d64f10f0\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" Apr 20 20:16:07.213576 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.213501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acff97c4-73fb-450b-a766-3774d64f10f0-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8\" (UID: \"acff97c4-73fb-450b-a766-3774d64f10f0\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" Apr 20 20:16:07.213576 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.213526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbwsv\" (UniqueName: \"kubernetes.io/projected/acff97c4-73fb-450b-a766-3774d64f10f0-kube-api-access-tbwsv\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8\" (UID: \"acff97c4-73fb-450b-a766-3774d64f10f0\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" Apr 20 20:16:07.213897 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.213877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acff97c4-73fb-450b-a766-3774d64f10f0-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8\" (UID: \"acff97c4-73fb-450b-a766-3774d64f10f0\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" Apr 20 20:16:07.213970 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.213915 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acff97c4-73fb-450b-a766-3774d64f10f0-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8\" (UID: \"acff97c4-73fb-450b-a766-3774d64f10f0\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" Apr 20 20:16:07.222546 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.222527 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbwsv\" (UniqueName: \"kubernetes.io/projected/acff97c4-73fb-450b-a766-3774d64f10f0-kube-api-access-tbwsv\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8\" (UID: \"acff97c4-73fb-450b-a766-3774d64f10f0\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" Apr 20 20:16:07.371266 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.371219 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" Apr 20 20:16:07.489287 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.489223 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8"] Apr 20 20:16:07.491595 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:16:07.491556 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacff97c4_73fb_450b_a766_3774d64f10f0.slice/crio-abe02ccbacb2e6abda8dc8804bec0e2f8e34d626fecaad5973427621b76eeeb8 WatchSource:0}: Error finding container abe02ccbacb2e6abda8dc8804bec0e2f8e34d626fecaad5973427621b76eeeb8: Status 404 returned error can't find the container with id abe02ccbacb2e6abda8dc8804bec0e2f8e34d626fecaad5973427621b76eeeb8 Apr 20 20:16:07.737305 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:07.737230 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" event={"ID":"acff97c4-73fb-450b-a766-3774d64f10f0","Type":"ContainerStarted","Data":"abe02ccbacb2e6abda8dc8804bec0e2f8e34d626fecaad5973427621b76eeeb8"} Apr 20 20:16:12.755148 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:12.755075 2576 generic.go:358] "Generic (PLEG): container finished" podID="acff97c4-73fb-450b-a766-3774d64f10f0" containerID="04393a0622f2f91f06343b13d84bad06b39387f45c7b856ab773575f2ff53c56" exitCode=0 Apr 20 20:16:12.755444 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:12.755165 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" event={"ID":"acff97c4-73fb-450b-a766-3774d64f10f0","Type":"ContainerDied","Data":"04393a0622f2f91f06343b13d84bad06b39387f45c7b856ab773575f2ff53c56"} Apr 20 20:16:14.762031 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:14.761997 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" event={"ID":"acff97c4-73fb-450b-a766-3774d64f10f0","Type":"ContainerStarted","Data":"5f60183c5e12564dff829db21de0f704b4bfe5b35b49068a10eab09b7aa36364"} Apr 20 20:16:15.766987 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:15.766953 2576 generic.go:358] "Generic (PLEG): container finished" podID="acff97c4-73fb-450b-a766-3774d64f10f0" containerID="5f60183c5e12564dff829db21de0f704b4bfe5b35b49068a10eab09b7aa36364" exitCode=0 Apr 20 20:16:15.767354 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:15.766993 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" event={"ID":"acff97c4-73fb-450b-a766-3774d64f10f0","Type":"ContainerDied","Data":"5f60183c5e12564dff829db21de0f704b4bfe5b35b49068a10eab09b7aa36364"} Apr 20 20:16:22.790990 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:22.790957 2576 generic.go:358] "Generic (PLEG): container finished" podID="acff97c4-73fb-450b-a766-3774d64f10f0" containerID="f94da2d2925afab0939baf55fc91234caa1a078081c6a540f9f2f1b67dda4cb0" exitCode=0 Apr 20 20:16:22.791355 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:22.791036 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" event={"ID":"acff97c4-73fb-450b-a766-3774d64f10f0","Type":"ContainerDied","Data":"f94da2d2925afab0939baf55fc91234caa1a078081c6a540f9f2f1b67dda4cb0"} Apr 20 20:16:23.912232 ip-10-0-129-247 kubenswrapper[2576]: I0420 
20:16:23.912210 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" Apr 20 20:16:24.044880 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.044814 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbwsv\" (UniqueName: \"kubernetes.io/projected/acff97c4-73fb-450b-a766-3774d64f10f0-kube-api-access-tbwsv\") pod \"acff97c4-73fb-450b-a766-3774d64f10f0\" (UID: \"acff97c4-73fb-450b-a766-3774d64f10f0\") " Apr 20 20:16:24.044880 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.044852 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acff97c4-73fb-450b-a766-3774d64f10f0-bundle\") pod \"acff97c4-73fb-450b-a766-3774d64f10f0\" (UID: \"acff97c4-73fb-450b-a766-3774d64f10f0\") " Apr 20 20:16:24.045043 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.044893 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acff97c4-73fb-450b-a766-3774d64f10f0-util\") pod \"acff97c4-73fb-450b-a766-3774d64f10f0\" (UID: \"acff97c4-73fb-450b-a766-3774d64f10f0\") " Apr 20 20:16:24.045561 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.045490 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acff97c4-73fb-450b-a766-3774d64f10f0-bundle" (OuterVolumeSpecName: "bundle") pod "acff97c4-73fb-450b-a766-3774d64f10f0" (UID: "acff97c4-73fb-450b-a766-3774d64f10f0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:16:24.047056 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.047026 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acff97c4-73fb-450b-a766-3774d64f10f0-kube-api-access-tbwsv" (OuterVolumeSpecName: "kube-api-access-tbwsv") pod "acff97c4-73fb-450b-a766-3774d64f10f0" (UID: "acff97c4-73fb-450b-a766-3774d64f10f0"). InnerVolumeSpecName "kube-api-access-tbwsv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:16:24.048830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.048809 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acff97c4-73fb-450b-a766-3774d64f10f0-util" (OuterVolumeSpecName: "util") pod "acff97c4-73fb-450b-a766-3774d64f10f0" (UID: "acff97c4-73fb-450b-a766-3774d64f10f0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:16:24.145611 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.145587 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acff97c4-73fb-450b-a766-3774d64f10f0-bundle\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:16:24.145611 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.145607 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acff97c4-73fb-450b-a766-3774d64f10f0-util\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:16:24.145719 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.145617 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tbwsv\" (UniqueName: \"kubernetes.io/projected/acff97c4-73fb-450b-a766-3774d64f10f0-kube-api-access-tbwsv\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:16:24.670118 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.670091 2576 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 20:16:24.670310 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.670294 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 20:16:24.678715 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.678689 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:16:24.679116 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.679097 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:16:24.800030 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.797697 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" event={"ID":"acff97c4-73fb-450b-a766-3774d64f10f0","Type":"ContainerDied","Data":"abe02ccbacb2e6abda8dc8804bec0e2f8e34d626fecaad5973427621b76eeeb8"} Apr 20 20:16:24.800030 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.797725 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abe02ccbacb2e6abda8dc8804bec0e2f8e34d626fecaad5973427621b76eeeb8" Apr 20 20:16:24.800030 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:24.797789 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8vrh8" Apr 20 20:16:29.958490 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:29.958456 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rrbt9"] Apr 20 20:16:29.958868 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:29.958781 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acff97c4-73fb-450b-a766-3774d64f10f0" containerName="pull" Apr 20 20:16:29.958868 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:29.958793 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="acff97c4-73fb-450b-a766-3774d64f10f0" containerName="pull" Apr 20 20:16:29.958868 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:29.958803 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acff97c4-73fb-450b-a766-3774d64f10f0" containerName="extract" Apr 20 20:16:29.958868 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:29.958808 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="acff97c4-73fb-450b-a766-3774d64f10f0" containerName="extract" Apr 20 20:16:29.958868 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:29.958825 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acff97c4-73fb-450b-a766-3774d64f10f0" containerName="util" Apr 20 20:16:29.958868 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:29.958831 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="acff97c4-73fb-450b-a766-3774d64f10f0" containerName="util" Apr 20 20:16:29.958868 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:29.958871 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="acff97c4-73fb-450b-a766-3774d64f10f0" containerName="extract" Apr 20 20:16:29.961530 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:29.961514 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rrbt9" Apr 20 20:16:29.964036 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:29.964012 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 20 20:16:29.964172 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:29.964046 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:16:29.964172 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:29.964076 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-z7pf7\"" Apr 20 20:16:29.973176 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:29.973153 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rrbt9"] Apr 20 20:16:30.086487 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:30.086455 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e6b87e7-8878-4fc2-9efd-b9a905267a10-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-rrbt9\" (UID: \"2e6b87e7-8878-4fc2-9efd-b9a905267a10\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rrbt9" Apr 20 20:16:30.086596 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:30.086502 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59hvp\" (UniqueName: \"kubernetes.io/projected/2e6b87e7-8878-4fc2-9efd-b9a905267a10-kube-api-access-59hvp\") pod \"cert-manager-operator-controller-manager-54b9655956-rrbt9\" (UID: \"2e6b87e7-8878-4fc2-9efd-b9a905267a10\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rrbt9" 
Apr 20 20:16:30.187325 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:30.187301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e6b87e7-8878-4fc2-9efd-b9a905267a10-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-rrbt9\" (UID: \"2e6b87e7-8878-4fc2-9efd-b9a905267a10\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rrbt9" Apr 20 20:16:30.187405 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:30.187340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59hvp\" (UniqueName: \"kubernetes.io/projected/2e6b87e7-8878-4fc2-9efd-b9a905267a10-kube-api-access-59hvp\") pod \"cert-manager-operator-controller-manager-54b9655956-rrbt9\" (UID: \"2e6b87e7-8878-4fc2-9efd-b9a905267a10\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rrbt9" Apr 20 20:16:30.187642 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:30.187623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e6b87e7-8878-4fc2-9efd-b9a905267a10-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-rrbt9\" (UID: \"2e6b87e7-8878-4fc2-9efd-b9a905267a10\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rrbt9" Apr 20 20:16:30.195579 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:30.195562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59hvp\" (UniqueName: \"kubernetes.io/projected/2e6b87e7-8878-4fc2-9efd-b9a905267a10-kube-api-access-59hvp\") pod \"cert-manager-operator-controller-manager-54b9655956-rrbt9\" (UID: \"2e6b87e7-8878-4fc2-9efd-b9a905267a10\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rrbt9" Apr 20 20:16:30.272064 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:30.272006 2576 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rrbt9" Apr 20 20:16:30.406850 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:30.406822 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rrbt9"] Apr 20 20:16:30.410253 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:16:30.410224 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e6b87e7_8878_4fc2_9efd_b9a905267a10.slice/crio-8e39704f6aed58b2a8319f53740b31fa29c76db38f29a308bc36626b83c360b5 WatchSource:0}: Error finding container 8e39704f6aed58b2a8319f53740b31fa29c76db38f29a308bc36626b83c360b5: Status 404 returned error can't find the container with id 8e39704f6aed58b2a8319f53740b31fa29c76db38f29a308bc36626b83c360b5 Apr 20 20:16:30.412729 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:30.412712 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:16:30.814310 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:30.814276 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rrbt9" event={"ID":"2e6b87e7-8878-4fc2-9efd-b9a905267a10","Type":"ContainerStarted","Data":"8e39704f6aed58b2a8319f53740b31fa29c76db38f29a308bc36626b83c360b5"} Apr 20 20:16:32.821960 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:32.821923 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rrbt9" event={"ID":"2e6b87e7-8878-4fc2-9efd-b9a905267a10","Type":"ContainerStarted","Data":"397d72908aa8d60295ff4909190085de9bfa5730aa300a638fbe278592e737b1"} Apr 20 20:16:32.844836 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:32.844783 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rrbt9" podStartSLOduration=2.4428934460000002 podStartE2EDuration="3.844769089s" podCreationTimestamp="2026-04-20 20:16:29 +0000 UTC" firstStartedPulling="2026-04-20 20:16:30.412885351 +0000 UTC m=+306.282663959" lastFinishedPulling="2026-04-20 20:16:31.81476099 +0000 UTC m=+307.684539602" observedRunningTime="2026-04-20 20:16:32.842328353 +0000 UTC m=+308.712106984" watchObservedRunningTime="2026-04-20 20:16:32.844769089 +0000 UTC m=+308.714547719" Apr 20 20:16:34.026164 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.026134 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw"] Apr 20 20:16:34.030708 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.030685 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" Apr 20 20:16:34.033271 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.033239 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 20:16:34.033394 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.033244 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x9jb9\"" Apr 20 20:16:34.034403 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.034379 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 20:16:34.036802 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.036780 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw"] Apr 20 20:16:34.120804 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.120769 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54fdea8b-454b-4184-b32b-96470951dc86-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw\" (UID: \"54fdea8b-454b-4184-b32b-96470951dc86\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" Apr 20 20:16:34.120932 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.120830 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp76p\" (UniqueName: \"kubernetes.io/projected/54fdea8b-454b-4184-b32b-96470951dc86-kube-api-access-pp76p\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw\" (UID: \"54fdea8b-454b-4184-b32b-96470951dc86\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" Apr 20 20:16:34.120932 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.120892 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54fdea8b-454b-4184-b32b-96470951dc86-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw\" (UID: \"54fdea8b-454b-4184-b32b-96470951dc86\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" Apr 20 20:16:34.221921 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.221886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54fdea8b-454b-4184-b32b-96470951dc86-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw\" (UID: \"54fdea8b-454b-4184-b32b-96470951dc86\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" Apr 20 20:16:34.222019 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.221947 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54fdea8b-454b-4184-b32b-96470951dc86-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw\" (UID: \"54fdea8b-454b-4184-b32b-96470951dc86\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" Apr 20 20:16:34.222019 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.221969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pp76p\" (UniqueName: \"kubernetes.io/projected/54fdea8b-454b-4184-b32b-96470951dc86-kube-api-access-pp76p\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw\" (UID: \"54fdea8b-454b-4184-b32b-96470951dc86\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" Apr 20 20:16:34.222249 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.222232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54fdea8b-454b-4184-b32b-96470951dc86-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw\" (UID: \"54fdea8b-454b-4184-b32b-96470951dc86\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" Apr 20 20:16:34.222302 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.222285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54fdea8b-454b-4184-b32b-96470951dc86-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw\" (UID: \"54fdea8b-454b-4184-b32b-96470951dc86\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" Apr 20 20:16:34.230972 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.230950 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp76p\" (UniqueName: 
\"kubernetes.io/projected/54fdea8b-454b-4184-b32b-96470951dc86-kube-api-access-pp76p\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw\" (UID: \"54fdea8b-454b-4184-b32b-96470951dc86\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" Apr 20 20:16:34.341352 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.341330 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" Apr 20 20:16:34.666009 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.665985 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw"] Apr 20 20:16:34.668331 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:16:34.668300 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54fdea8b_454b_4184_b32b_96470951dc86.slice/crio-7fadf92d4e5be16c3199aa1053d169d8c0d289572ea6ca9cc375b7ba245f25c4 WatchSource:0}: Error finding container 7fadf92d4e5be16c3199aa1053d169d8c0d289572ea6ca9cc375b7ba245f25c4: Status 404 returned error can't find the container with id 7fadf92d4e5be16c3199aa1053d169d8c0d289572ea6ca9cc375b7ba245f25c4 Apr 20 20:16:34.829701 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.829668 2576 generic.go:358] "Generic (PLEG): container finished" podID="54fdea8b-454b-4184-b32b-96470951dc86" containerID="353b35fdf234c3c89da8e7888233a9061d624662702b8976851144627ff063d8" exitCode=0 Apr 20 20:16:34.829839 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.829749 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" event={"ID":"54fdea8b-454b-4184-b32b-96470951dc86","Type":"ContainerDied","Data":"353b35fdf234c3c89da8e7888233a9061d624662702b8976851144627ff063d8"} Apr 20 
20:16:34.829839 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:34.829770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" event={"ID":"54fdea8b-454b-4184-b32b-96470951dc86","Type":"ContainerStarted","Data":"7fadf92d4e5be16c3199aa1053d169d8c0d289572ea6ca9cc375b7ba245f25c4"} Apr 20 20:16:36.447353 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:36.447317 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-dtzns"] Apr 20 20:16:36.451881 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:36.451861 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-dtzns" Apr 20 20:16:36.454614 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:36.454592 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 20:16:36.455787 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:36.455727 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-6m56n\"" Apr 20 20:16:36.455787 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:36.455765 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 20:16:36.473680 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:36.473657 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-dtzns"] Apr 20 20:16:36.541063 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:36.541030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8a521f8-27f9-4257-83bb-0768d2ca1ddd-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-dtzns\" (UID: 
\"d8a521f8-27f9-4257-83bb-0768d2ca1ddd\") " pod="cert-manager/cert-manager-cainjector-68b757865b-dtzns" Apr 20 20:16:36.541191 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:36.541085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5qjw\" (UniqueName: \"kubernetes.io/projected/d8a521f8-27f9-4257-83bb-0768d2ca1ddd-kube-api-access-m5qjw\") pod \"cert-manager-cainjector-68b757865b-dtzns\" (UID: \"d8a521f8-27f9-4257-83bb-0768d2ca1ddd\") " pod="cert-manager/cert-manager-cainjector-68b757865b-dtzns" Apr 20 20:16:36.642227 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:36.642198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8a521f8-27f9-4257-83bb-0768d2ca1ddd-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-dtzns\" (UID: \"d8a521f8-27f9-4257-83bb-0768d2ca1ddd\") " pod="cert-manager/cert-manager-cainjector-68b757865b-dtzns" Apr 20 20:16:36.642380 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:36.642243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qjw\" (UniqueName: \"kubernetes.io/projected/d8a521f8-27f9-4257-83bb-0768d2ca1ddd-kube-api-access-m5qjw\") pod \"cert-manager-cainjector-68b757865b-dtzns\" (UID: \"d8a521f8-27f9-4257-83bb-0768d2ca1ddd\") " pod="cert-manager/cert-manager-cainjector-68b757865b-dtzns" Apr 20 20:16:36.650583 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:36.650554 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qjw\" (UniqueName: \"kubernetes.io/projected/d8a521f8-27f9-4257-83bb-0768d2ca1ddd-kube-api-access-m5qjw\") pod \"cert-manager-cainjector-68b757865b-dtzns\" (UID: \"d8a521f8-27f9-4257-83bb-0768d2ca1ddd\") " pod="cert-manager/cert-manager-cainjector-68b757865b-dtzns" Apr 20 20:16:36.650715 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:36.650615 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8a521f8-27f9-4257-83bb-0768d2ca1ddd-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-dtzns\" (UID: \"d8a521f8-27f9-4257-83bb-0768d2ca1ddd\") " pod="cert-manager/cert-manager-cainjector-68b757865b-dtzns" Apr 20 20:16:36.761905 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:36.761830 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-dtzns" Apr 20 20:16:36.932252 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:36.932227 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-dtzns"] Apr 20 20:16:36.935015 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:16:36.934990 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8a521f8_27f9_4257_83bb_0768d2ca1ddd.slice/crio-36d803d600e08b6a50ce46e29e6cb02708d3f60f9e863bf9f08416c8964e2174 WatchSource:0}: Error finding container 36d803d600e08b6a50ce46e29e6cb02708d3f60f9e863bf9f08416c8964e2174: Status 404 returned error can't find the container with id 36d803d600e08b6a50ce46e29e6cb02708d3f60f9e863bf9f08416c8964e2174 Apr 20 20:16:37.841632 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:37.841594 2576 generic.go:358] "Generic (PLEG): container finished" podID="54fdea8b-454b-4184-b32b-96470951dc86" containerID="3bac8bc2df1fe80fdf92d5de03ace70ba11120b490958a0a4606745464bafe19" exitCode=0 Apr 20 20:16:37.842172 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:37.841683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" event={"ID":"54fdea8b-454b-4184-b32b-96470951dc86","Type":"ContainerDied","Data":"3bac8bc2df1fe80fdf92d5de03ace70ba11120b490958a0a4606745464bafe19"} Apr 20 20:16:37.842935 
ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:37.842910 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-dtzns" event={"ID":"d8a521f8-27f9-4257-83bb-0768d2ca1ddd","Type":"ContainerStarted","Data":"36d803d600e08b6a50ce46e29e6cb02708d3f60f9e863bf9f08416c8964e2174"} Apr 20 20:16:38.848435 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:38.848397 2576 generic.go:358] "Generic (PLEG): container finished" podID="54fdea8b-454b-4184-b32b-96470951dc86" containerID="a325d789553673471fd2f7705c9e858e68fc2857ce6e025ae71bf0e5ca4f5ebc" exitCode=0 Apr 20 20:16:38.848881 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:38.848480 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" event={"ID":"54fdea8b-454b-4184-b32b-96470951dc86","Type":"ContainerDied","Data":"a325d789553673471fd2f7705c9e858e68fc2857ce6e025ae71bf0e5ca4f5ebc"} Apr 20 20:16:39.853478 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:39.853435 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-dtzns" event={"ID":"d8a521f8-27f9-4257-83bb-0768d2ca1ddd","Type":"ContainerStarted","Data":"9d5d817adaf211d3578ed5f682ed3cff025e1d3ca84281ad5f5b084b0b518c69"} Apr 20 20:16:39.868936 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:39.868889 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-dtzns" podStartSLOduration=1.451717141 podStartE2EDuration="3.8688752s" podCreationTimestamp="2026-04-20 20:16:36 +0000 UTC" firstStartedPulling="2026-04-20 20:16:36.936970047 +0000 UTC m=+312.806748655" lastFinishedPulling="2026-04-20 20:16:39.354128104 +0000 UTC m=+315.223906714" observedRunningTime="2026-04-20 20:16:39.866763828 +0000 UTC m=+315.736542459" watchObservedRunningTime="2026-04-20 20:16:39.8688752 +0000 UTC m=+315.738653831" Apr 20 
20:16:39.985305 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:39.985284 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" Apr 20 20:16:40.067828 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:40.067801 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54fdea8b-454b-4184-b32b-96470951dc86-bundle\") pod \"54fdea8b-454b-4184-b32b-96470951dc86\" (UID: \"54fdea8b-454b-4184-b32b-96470951dc86\") " Apr 20 20:16:40.067949 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:40.067848 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp76p\" (UniqueName: \"kubernetes.io/projected/54fdea8b-454b-4184-b32b-96470951dc86-kube-api-access-pp76p\") pod \"54fdea8b-454b-4184-b32b-96470951dc86\" (UID: \"54fdea8b-454b-4184-b32b-96470951dc86\") " Apr 20 20:16:40.067949 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:40.067920 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54fdea8b-454b-4184-b32b-96470951dc86-util\") pod \"54fdea8b-454b-4184-b32b-96470951dc86\" (UID: \"54fdea8b-454b-4184-b32b-96470951dc86\") " Apr 20 20:16:40.068290 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:40.068265 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54fdea8b-454b-4184-b32b-96470951dc86-bundle" (OuterVolumeSpecName: "bundle") pod "54fdea8b-454b-4184-b32b-96470951dc86" (UID: "54fdea8b-454b-4184-b32b-96470951dc86"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:16:40.070086 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:40.070058 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54fdea8b-454b-4184-b32b-96470951dc86-kube-api-access-pp76p" (OuterVolumeSpecName: "kube-api-access-pp76p") pod "54fdea8b-454b-4184-b32b-96470951dc86" (UID: "54fdea8b-454b-4184-b32b-96470951dc86"). InnerVolumeSpecName "kube-api-access-pp76p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:16:40.073999 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:40.073968 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54fdea8b-454b-4184-b32b-96470951dc86-util" (OuterVolumeSpecName: "util") pod "54fdea8b-454b-4184-b32b-96470951dc86" (UID: "54fdea8b-454b-4184-b32b-96470951dc86"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:16:40.169294 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:40.169226 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54fdea8b-454b-4184-b32b-96470951dc86-util\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:16:40.169294 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:40.169259 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54fdea8b-454b-4184-b32b-96470951dc86-bundle\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:16:40.169294 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:40.169274 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pp76p\" (UniqueName: \"kubernetes.io/projected/54fdea8b-454b-4184-b32b-96470951dc86-kube-api-access-pp76p\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:16:40.857874 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:40.857847 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" Apr 20 20:16:40.858203 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:40.857867 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdscpw" event={"ID":"54fdea8b-454b-4184-b32b-96470951dc86","Type":"ContainerDied","Data":"7fadf92d4e5be16c3199aa1053d169d8c0d289572ea6ca9cc375b7ba245f25c4"} Apr 20 20:16:40.858203 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:40.857898 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fadf92d4e5be16c3199aa1053d169d8c0d289572ea6ca9cc375b7ba245f25c4" Apr 20 20:16:53.382728 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.382692 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-5bsgz"] Apr 20 20:16:53.383106 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.383009 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54fdea8b-454b-4184-b32b-96470951dc86" containerName="pull" Apr 20 20:16:53.383106 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.383020 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fdea8b-454b-4184-b32b-96470951dc86" containerName="pull" Apr 20 20:16:53.383106 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.383037 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54fdea8b-454b-4184-b32b-96470951dc86" containerName="extract" Apr 20 20:16:53.383106 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.383043 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fdea8b-454b-4184-b32b-96470951dc86" containerName="extract" Apr 20 20:16:53.383106 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.383052 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="54fdea8b-454b-4184-b32b-96470951dc86" containerName="util" Apr 20 20:16:53.383106 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.383058 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fdea8b-454b-4184-b32b-96470951dc86" containerName="util" Apr 20 20:16:53.383106 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.383102 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="54fdea8b-454b-4184-b32b-96470951dc86" containerName="extract" Apr 20 20:16:53.388473 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.388457 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-5bsgz" Apr 20 20:16:53.390836 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.390816 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-dkph9\"" Apr 20 20:16:53.395771 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.395710 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-5bsgz"] Apr 20 20:16:53.564343 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.564305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqrt6\" (UniqueName: \"kubernetes.io/projected/6204ae70-2b0d-489c-9d68-f287d08129a4-kube-api-access-hqrt6\") pod \"cert-manager-79c8d999ff-5bsgz\" (UID: \"6204ae70-2b0d-489c-9d68-f287d08129a4\") " pod="cert-manager/cert-manager-79c8d999ff-5bsgz" Apr 20 20:16:53.564478 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.564356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6204ae70-2b0d-489c-9d68-f287d08129a4-bound-sa-token\") pod \"cert-manager-79c8d999ff-5bsgz\" (UID: \"6204ae70-2b0d-489c-9d68-f287d08129a4\") " pod="cert-manager/cert-manager-79c8d999ff-5bsgz" Apr 20 20:16:53.665539 
ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.665472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqrt6\" (UniqueName: \"kubernetes.io/projected/6204ae70-2b0d-489c-9d68-f287d08129a4-kube-api-access-hqrt6\") pod \"cert-manager-79c8d999ff-5bsgz\" (UID: \"6204ae70-2b0d-489c-9d68-f287d08129a4\") " pod="cert-manager/cert-manager-79c8d999ff-5bsgz" Apr 20 20:16:53.665539 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.665515 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6204ae70-2b0d-489c-9d68-f287d08129a4-bound-sa-token\") pod \"cert-manager-79c8d999ff-5bsgz\" (UID: \"6204ae70-2b0d-489c-9d68-f287d08129a4\") " pod="cert-manager/cert-manager-79c8d999ff-5bsgz" Apr 20 20:16:53.674454 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.674430 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6204ae70-2b0d-489c-9d68-f287d08129a4-bound-sa-token\") pod \"cert-manager-79c8d999ff-5bsgz\" (UID: \"6204ae70-2b0d-489c-9d68-f287d08129a4\") " pod="cert-manager/cert-manager-79c8d999ff-5bsgz" Apr 20 20:16:53.674557 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.674517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqrt6\" (UniqueName: \"kubernetes.io/projected/6204ae70-2b0d-489c-9d68-f287d08129a4-kube-api-access-hqrt6\") pod \"cert-manager-79c8d999ff-5bsgz\" (UID: \"6204ae70-2b0d-489c-9d68-f287d08129a4\") " pod="cert-manager/cert-manager-79c8d999ff-5bsgz" Apr 20 20:16:53.697800 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.697782 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-5bsgz" Apr 20 20:16:53.813646 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.813625 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-5bsgz"] Apr 20 20:16:53.816020 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:16:53.815995 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6204ae70_2b0d_489c_9d68_f287d08129a4.slice/crio-93c442ec1b2f9886a613122b66c08cc3bfb3d0f6a5fd883221f4ba43736105e5 WatchSource:0}: Error finding container 93c442ec1b2f9886a613122b66c08cc3bfb3d0f6a5fd883221f4ba43736105e5: Status 404 returned error can't find the container with id 93c442ec1b2f9886a613122b66c08cc3bfb3d0f6a5fd883221f4ba43736105e5 Apr 20 20:16:53.900892 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.900865 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-5bsgz" event={"ID":"6204ae70-2b0d-489c-9d68-f287d08129a4","Type":"ContainerStarted","Data":"66714ae87a0ee4c00432d3fcf9a347e618eb67f03af57cb5108d30307954c298"} Apr 20 20:16:53.901030 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.900899 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-5bsgz" event={"ID":"6204ae70-2b0d-489c-9d68-f287d08129a4","Type":"ContainerStarted","Data":"93c442ec1b2f9886a613122b66c08cc3bfb3d0f6a5fd883221f4ba43736105e5"} Apr 20 20:16:53.916711 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:53.916631 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-5bsgz" podStartSLOduration=0.916619321 podStartE2EDuration="916.619321ms" podCreationTimestamp="2026-04-20 20:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:16:53.915876849 +0000 UTC 
m=+329.785655480" watchObservedRunningTime="2026-04-20 20:16:53.916619321 +0000 UTC m=+329.786397951" Apr 20 20:16:54.412282 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.412214 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs"] Apr 20 20:16:54.437386 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.437359 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs"] Apr 20 20:16:54.437527 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.437491 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" Apr 20 20:16:54.440082 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.440058 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 20:16:54.441200 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.441133 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 20:16:54.441200 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.441133 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x9jb9\"" Apr 20 20:16:54.571071 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.571039 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxnhn\" (UniqueName: \"kubernetes.io/projected/081653b0-768a-4fe7-a429-a74680e82b89-kube-api-access-gxnhn\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs\" (UID: \"081653b0-768a-4fe7-a429-a74680e82b89\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" Apr 20 20:16:54.571179 
ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.571076 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/081653b0-768a-4fe7-a429-a74680e82b89-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs\" (UID: \"081653b0-768a-4fe7-a429-a74680e82b89\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" Apr 20 20:16:54.571179 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.571135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/081653b0-768a-4fe7-a429-a74680e82b89-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs\" (UID: \"081653b0-768a-4fe7-a429-a74680e82b89\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" Apr 20 20:16:54.672064 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.672007 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxnhn\" (UniqueName: \"kubernetes.io/projected/081653b0-768a-4fe7-a429-a74680e82b89-kube-api-access-gxnhn\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs\" (UID: \"081653b0-768a-4fe7-a429-a74680e82b89\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" Apr 20 20:16:54.672064 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.672041 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/081653b0-768a-4fe7-a429-a74680e82b89-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs\" (UID: \"081653b0-768a-4fe7-a429-a74680e82b89\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" Apr 20 20:16:54.672252 ip-10-0-129-247 kubenswrapper[2576]: I0420 
20:16:54.672234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/081653b0-768a-4fe7-a429-a74680e82b89-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs\" (UID: \"081653b0-768a-4fe7-a429-a74680e82b89\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" Apr 20 20:16:54.672406 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.672387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/081653b0-768a-4fe7-a429-a74680e82b89-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs\" (UID: \"081653b0-768a-4fe7-a429-a74680e82b89\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" Apr 20 20:16:54.672545 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.672527 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/081653b0-768a-4fe7-a429-a74680e82b89-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs\" (UID: \"081653b0-768a-4fe7-a429-a74680e82b89\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" Apr 20 20:16:54.679940 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.679920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxnhn\" (UniqueName: \"kubernetes.io/projected/081653b0-768a-4fe7-a429-a74680e82b89-kube-api-access-gxnhn\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs\" (UID: \"081653b0-768a-4fe7-a429-a74680e82b89\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" Apr 20 20:16:54.747685 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.747662 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" Apr 20 20:16:54.867815 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.867789 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs"] Apr 20 20:16:54.869636 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:16:54.869607 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod081653b0_768a_4fe7_a429_a74680e82b89.slice/crio-a9fd0a1f38d432903d9619d7e44691a711d94c014307e0f9b1e26582a5791980 WatchSource:0}: Error finding container a9fd0a1f38d432903d9619d7e44691a711d94c014307e0f9b1e26582a5791980: Status 404 returned error can't find the container with id a9fd0a1f38d432903d9619d7e44691a711d94c014307e0f9b1e26582a5791980 Apr 20 20:16:54.905282 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:54.905256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" event={"ID":"081653b0-768a-4fe7-a429-a74680e82b89","Type":"ContainerStarted","Data":"a9fd0a1f38d432903d9619d7e44691a711d94c014307e0f9b1e26582a5791980"} Apr 20 20:16:55.909424 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:55.909393 2576 generic.go:358] "Generic (PLEG): container finished" podID="081653b0-768a-4fe7-a429-a74680e82b89" containerID="43ae68bf35e5a02a8ef03444c52cdc34a3f8c2c613cc08f9441907ab9aea11ba" exitCode=0 Apr 20 20:16:55.909781 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:55.909466 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" event={"ID":"081653b0-768a-4fe7-a429-a74680e82b89","Type":"ContainerDied","Data":"43ae68bf35e5a02a8ef03444c52cdc34a3f8c2c613cc08f9441907ab9aea11ba"} Apr 20 20:16:57.917430 ip-10-0-129-247 kubenswrapper[2576]: 
I0420 20:16:57.917399 2576 generic.go:358] "Generic (PLEG): container finished" podID="081653b0-768a-4fe7-a429-a74680e82b89" containerID="9d1e56675e4b0041af3ea68a0a2e5dbe408746c333240735043cd6bf26c71534" exitCode=0 Apr 20 20:16:57.917842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:57.917478 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" event={"ID":"081653b0-768a-4fe7-a429-a74680e82b89","Type":"ContainerDied","Data":"9d1e56675e4b0041af3ea68a0a2e5dbe408746c333240735043cd6bf26c71534"} Apr 20 20:16:58.922695 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:58.922657 2576 generic.go:358] "Generic (PLEG): container finished" podID="081653b0-768a-4fe7-a429-a74680e82b89" containerID="e1ab2e21a644cf57049b261faf3c656b2c5bb6ad7ba501c065ef30fbcf393333" exitCode=0 Apr 20 20:16:58.923086 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:16:58.922767 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" event={"ID":"081653b0-768a-4fe7-a429-a74680e82b89","Type":"ContainerDied","Data":"e1ab2e21a644cf57049b261faf3c656b2c5bb6ad7ba501c065ef30fbcf393333"} Apr 20 20:17:00.043242 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:00.043219 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" Apr 20 20:17:00.112191 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:00.112161 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxnhn\" (UniqueName: \"kubernetes.io/projected/081653b0-768a-4fe7-a429-a74680e82b89-kube-api-access-gxnhn\") pod \"081653b0-768a-4fe7-a429-a74680e82b89\" (UID: \"081653b0-768a-4fe7-a429-a74680e82b89\") " Apr 20 20:17:00.112346 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:00.112197 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/081653b0-768a-4fe7-a429-a74680e82b89-bundle\") pod \"081653b0-768a-4fe7-a429-a74680e82b89\" (UID: \"081653b0-768a-4fe7-a429-a74680e82b89\") " Apr 20 20:17:00.112346 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:00.112229 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/081653b0-768a-4fe7-a429-a74680e82b89-util\") pod \"081653b0-768a-4fe7-a429-a74680e82b89\" (UID: \"081653b0-768a-4fe7-a429-a74680e82b89\") " Apr 20 20:17:00.112944 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:00.112923 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/081653b0-768a-4fe7-a429-a74680e82b89-bundle" (OuterVolumeSpecName: "bundle") pod "081653b0-768a-4fe7-a429-a74680e82b89" (UID: "081653b0-768a-4fe7-a429-a74680e82b89"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:17:00.114378 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:00.114356 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/081653b0-768a-4fe7-a429-a74680e82b89-kube-api-access-gxnhn" (OuterVolumeSpecName: "kube-api-access-gxnhn") pod "081653b0-768a-4fe7-a429-a74680e82b89" (UID: "081653b0-768a-4fe7-a429-a74680e82b89"). InnerVolumeSpecName "kube-api-access-gxnhn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:17:00.117505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:00.117483 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/081653b0-768a-4fe7-a429-a74680e82b89-util" (OuterVolumeSpecName: "util") pod "081653b0-768a-4fe7-a429-a74680e82b89" (UID: "081653b0-768a-4fe7-a429-a74680e82b89"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:17:00.213120 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:00.213064 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxnhn\" (UniqueName: \"kubernetes.io/projected/081653b0-768a-4fe7-a429-a74680e82b89-kube-api-access-gxnhn\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:17:00.213120 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:00.213089 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/081653b0-768a-4fe7-a429-a74680e82b89-bundle\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:17:00.213120 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:00.213098 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/081653b0-768a-4fe7-a429-a74680e82b89-util\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:17:00.930354 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:00.930322 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" event={"ID":"081653b0-768a-4fe7-a429-a74680e82b89","Type":"ContainerDied","Data":"a9fd0a1f38d432903d9619d7e44691a711d94c014307e0f9b1e26582a5791980"} Apr 20 20:17:00.930482 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:00.930367 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9fd0a1f38d432903d9619d7e44691a711d94c014307e0f9b1e26582a5791980" Apr 20 20:17:00.930482 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:00.930331 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xc8cs" Apr 20 20:17:09.452238 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.452168 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt"] Apr 20 20:17:09.452670 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.452492 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="081653b0-768a-4fe7-a429-a74680e82b89" containerName="pull" Apr 20 20:17:09.452670 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.452504 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="081653b0-768a-4fe7-a429-a74680e82b89" containerName="pull" Apr 20 20:17:09.452670 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.452517 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="081653b0-768a-4fe7-a429-a74680e82b89" containerName="util" Apr 20 20:17:09.452670 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.452523 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="081653b0-768a-4fe7-a429-a74680e82b89" containerName="util" Apr 20 20:17:09.452670 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.452532 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="081653b0-768a-4fe7-a429-a74680e82b89" containerName="extract" Apr 20 20:17:09.452670 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.452538 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="081653b0-768a-4fe7-a429-a74680e82b89" containerName="extract" Apr 20 20:17:09.452670 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.452609 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="081653b0-768a-4fe7-a429-a74680e82b89" containerName="extract" Apr 20 20:17:09.457096 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.457079 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" Apr 20 20:17:09.459523 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.459498 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 20:17:09.459653 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.459533 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 20:17:09.459653 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.459533 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x9jb9\"" Apr 20 20:17:09.465355 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.465321 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt"] Apr 20 20:17:09.476496 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.476465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt\" (UID: 
\"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" Apr 20 20:17:09.476606 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.476523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhmwn\" (UniqueName: \"kubernetes.io/projected/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-kube-api-access-qhmwn\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt\" (UID: \"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" Apr 20 20:17:09.476664 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.476613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt\" (UID: \"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" Apr 20 20:17:09.577041 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.577019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt\" (UID: \"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" Apr 20 20:17:09.577146 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.577058 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt\" (UID: 
\"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" Apr 20 20:17:09.577146 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.577098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhmwn\" (UniqueName: \"kubernetes.io/projected/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-kube-api-access-qhmwn\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt\" (UID: \"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" Apr 20 20:17:09.577397 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.577382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt\" (UID: \"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" Apr 20 20:17:09.577432 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.577416 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt\" (UID: \"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" Apr 20 20:17:09.585183 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.585159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhmwn\" (UniqueName: \"kubernetes.io/projected/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-kube-api-access-qhmwn\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt\" (UID: \"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" Apr 20 20:17:09.767081 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.767030 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" Apr 20 20:17:09.896767 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.896714 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt"] Apr 20 20:17:09.898156 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:17:09.898132 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccb7c4cc_b27e_4f47_8216_f9a214e7f93e.slice/crio-9cd32fc32763817adb9d5952b773394a4ea04e6600d14c5b11c49aec5d44c279 WatchSource:0}: Error finding container 9cd32fc32763817adb9d5952b773394a4ea04e6600d14c5b11c49aec5d44c279: Status 404 returned error can't find the container with id 9cd32fc32763817adb9d5952b773394a4ea04e6600d14c5b11c49aec5d44c279 Apr 20 20:17:09.958910 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.958875 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" event={"ID":"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e","Type":"ContainerStarted","Data":"45de9a9e4c33779a136be880e94cb883bd8264c76da02bd955540f0fc0fe4720"} Apr 20 20:17:09.959006 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:09.958917 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" event={"ID":"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e","Type":"ContainerStarted","Data":"9cd32fc32763817adb9d5952b773394a4ea04e6600d14c5b11c49aec5d44c279"} Apr 20 20:17:10.963853 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:10.963764 2576 generic.go:358] 
"Generic (PLEG): container finished" podID="ccb7c4cc-b27e-4f47-8216-f9a214e7f93e" containerID="45de9a9e4c33779a136be880e94cb883bd8264c76da02bd955540f0fc0fe4720" exitCode=0 Apr 20 20:17:10.963853 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:10.963829 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" event={"ID":"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e","Type":"ContainerDied","Data":"45de9a9e4c33779a136be880e94cb883bd8264c76da02bd955540f0fc0fe4720"} Apr 20 20:17:11.026508 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.026473 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd"] Apr 20 20:17:11.028616 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.028589 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" Apr 20 20:17:11.031299 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.031262 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 20:17:11.031436 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.031412 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pq7n9\"" Apr 20 20:17:11.031579 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.031419 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 20:17:11.031579 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.031499 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 20:17:11.031579 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.031540 2576 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 20:17:11.053975 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.053952 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd"] Apr 20 20:17:11.087642 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.087614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be55442b-c3e3-4dcf-8da3-8653a73e8571-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-hmcsd\" (UID: \"be55442b-c3e3-4dcf-8da3-8653a73e8571\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" Apr 20 20:17:11.087732 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.087666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be55442b-c3e3-4dcf-8da3-8653a73e8571-webhook-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-hmcsd\" (UID: \"be55442b-c3e3-4dcf-8da3-8653a73e8571\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" Apr 20 20:17:11.087732 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.087689 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw8dq\" (UniqueName: \"kubernetes.io/projected/be55442b-c3e3-4dcf-8da3-8653a73e8571-kube-api-access-pw8dq\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-hmcsd\" (UID: \"be55442b-c3e3-4dcf-8da3-8653a73e8571\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" Apr 20 20:17:11.188503 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.188477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/be55442b-c3e3-4dcf-8da3-8653a73e8571-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-hmcsd\" (UID: \"be55442b-c3e3-4dcf-8da3-8653a73e8571\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" Apr 20 20:17:11.188606 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.188527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be55442b-c3e3-4dcf-8da3-8653a73e8571-webhook-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-hmcsd\" (UID: \"be55442b-c3e3-4dcf-8da3-8653a73e8571\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" Apr 20 20:17:11.188606 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.188545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw8dq\" (UniqueName: \"kubernetes.io/projected/be55442b-c3e3-4dcf-8da3-8653a73e8571-kube-api-access-pw8dq\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-hmcsd\" (UID: \"be55442b-c3e3-4dcf-8da3-8653a73e8571\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" Apr 20 20:17:11.191056 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.191030 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be55442b-c3e3-4dcf-8da3-8653a73e8571-webhook-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-hmcsd\" (UID: \"be55442b-c3e3-4dcf-8da3-8653a73e8571\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" Apr 20 20:17:11.191284 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.191267 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be55442b-c3e3-4dcf-8da3-8653a73e8571-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-hmcsd\" (UID: 
\"be55442b-c3e3-4dcf-8da3-8653a73e8571\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" Apr 20 20:17:11.198532 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.198513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw8dq\" (UniqueName: \"kubernetes.io/projected/be55442b-c3e3-4dcf-8da3-8653a73e8571-kube-api-access-pw8dq\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-hmcsd\" (UID: \"be55442b-c3e3-4dcf-8da3-8653a73e8571\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" Apr 20 20:17:11.339195 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.339172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" Apr 20 20:17:11.483086 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.483060 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd"] Apr 20 20:17:11.485596 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:17:11.485556 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe55442b_c3e3_4dcf_8da3_8653a73e8571.slice/crio-70571831a583f1935f809e817669685cf554744961e9c4ed1818e65385b5e81c WatchSource:0}: Error finding container 70571831a583f1935f809e817669685cf554744961e9c4ed1818e65385b5e81c: Status 404 returned error can't find the container with id 70571831a583f1935f809e817669685cf554744961e9c4ed1818e65385b5e81c Apr 20 20:17:11.968568 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.968493 2576 generic.go:358] "Generic (PLEG): container finished" podID="ccb7c4cc-b27e-4f47-8216-f9a214e7f93e" containerID="f27342889c97a0d94ef976e10a805a2c15794fc18e1962ad57f594afc43c1868" exitCode=0 Apr 20 20:17:11.968963 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.968592 2576 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" event={"ID":"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e","Type":"ContainerDied","Data":"f27342889c97a0d94ef976e10a805a2c15794fc18e1962ad57f594afc43c1868"} Apr 20 20:17:11.970259 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:11.970234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" event={"ID":"be55442b-c3e3-4dcf-8da3-8653a73e8571","Type":"ContainerStarted","Data":"70571831a583f1935f809e817669685cf554744961e9c4ed1818e65385b5e81c"} Apr 20 20:17:12.976670 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:12.976609 2576 generic.go:358] "Generic (PLEG): container finished" podID="ccb7c4cc-b27e-4f47-8216-f9a214e7f93e" containerID="a38eda56f6a4b441a9c5e2fd803f5bc8f2825eedee91902b0646d1999319246a" exitCode=0 Apr 20 20:17:12.977107 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:12.976723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" event={"ID":"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e","Type":"ContainerDied","Data":"a38eda56f6a4b441a9c5e2fd803f5bc8f2825eedee91902b0646d1999319246a"} Apr 20 20:17:14.114239 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.114221 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" Apr 20 20:17:14.212243 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.212211 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhmwn\" (UniqueName: \"kubernetes.io/projected/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-kube-api-access-qhmwn\") pod \"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\" (UID: \"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\") " Apr 20 20:17:14.212405 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.212267 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-util\") pod \"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\" (UID: \"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\") " Apr 20 20:17:14.212405 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.212329 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-bundle\") pod \"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\" (UID: \"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e\") " Apr 20 20:17:14.213231 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.213208 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-bundle" (OuterVolumeSpecName: "bundle") pod "ccb7c4cc-b27e-4f47-8216-f9a214e7f93e" (UID: "ccb7c4cc-b27e-4f47-8216-f9a214e7f93e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:17:14.214294 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.214272 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-kube-api-access-qhmwn" (OuterVolumeSpecName: "kube-api-access-qhmwn") pod "ccb7c4cc-b27e-4f47-8216-f9a214e7f93e" (UID: "ccb7c4cc-b27e-4f47-8216-f9a214e7f93e"). InnerVolumeSpecName "kube-api-access-qhmwn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:17:14.217470 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.217431 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-util" (OuterVolumeSpecName: "util") pod "ccb7c4cc-b27e-4f47-8216-f9a214e7f93e" (UID: "ccb7c4cc-b27e-4f47-8216-f9a214e7f93e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:17:14.313342 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.313319 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-bundle\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:17:14.313342 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.313342 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qhmwn\" (UniqueName: \"kubernetes.io/projected/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-kube-api-access-qhmwn\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:17:14.313483 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.313352 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ccb7c4cc-b27e-4f47-8216-f9a214e7f93e-util\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:17:14.984221 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.984184 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" event={"ID":"be55442b-c3e3-4dcf-8da3-8653a73e8571","Type":"ContainerStarted","Data":"eccbee0125fb80e2bb5150f8176928d4f471344cbbacbb09242769dcafdf58cc"} Apr 20 20:17:14.984388 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.984332 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" Apr 20 20:17:14.986007 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.985979 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt" event={"ID":"ccb7c4cc-b27e-4f47-8216-f9a214e7f93e","Type":"ContainerDied","Data":"9cd32fc32763817adb9d5952b773394a4ea04e6600d14c5b11c49aec5d44c279"} Apr 20 20:17:14.986007 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.986008 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cd32fc32763817adb9d5952b773394a4ea04e6600d14c5b11c49aec5d44c279" Apr 20 20:17:14.986168 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:14.986032 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9t7drt"
Apr 20 20:17:15.004853 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:15.004811 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd" podStartSLOduration=1.4393723999999999 podStartE2EDuration="4.004799976s" podCreationTimestamp="2026-04-20 20:17:11 +0000 UTC" firstStartedPulling="2026-04-20 20:17:11.487421949 +0000 UTC m=+347.357200559" lastFinishedPulling="2026-04-20 20:17:14.052849523 +0000 UTC m=+349.922628135" observedRunningTime="2026-04-20 20:17:15.002559958 +0000 UTC m=+350.872338591" watchObservedRunningTime="2026-04-20 20:17:15.004799976 +0000 UTC m=+350.874578606"
Apr 20 20:17:25.991278 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:25.991238 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-hmcsd"
Apr 20 20:17:27.965495 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:27.965465 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"]
Apr 20 20:17:27.965866 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:27.965794 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccb7c4cc-b27e-4f47-8216-f9a214e7f93e" containerName="extract"
Apr 20 20:17:27.965866 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:27.965805 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb7c4cc-b27e-4f47-8216-f9a214e7f93e" containerName="extract"
Apr 20 20:17:27.965866 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:27.965814 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccb7c4cc-b27e-4f47-8216-f9a214e7f93e" containerName="pull"
Apr 20 20:17:27.965866 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:27.965820 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb7c4cc-b27e-4f47-8216-f9a214e7f93e" containerName="pull"
Apr 20 20:17:27.965866 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:27.965828 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccb7c4cc-b27e-4f47-8216-f9a214e7f93e" containerName="util"
Apr 20 20:17:27.965866 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:27.965836 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb7c4cc-b27e-4f47-8216-f9a214e7f93e" containerName="util"
Apr 20 20:17:27.966036 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:27.965892 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ccb7c4cc-b27e-4f47-8216-f9a214e7f93e" containerName="extract"
Apr 20 20:17:27.968798 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:27.968782 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"
Apr 20 20:17:27.971246 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:27.971229 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 20:17:27.971356 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:27.971245 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 20:17:27.972109 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:27.972091 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x9jb9\""
Apr 20 20:17:27.977327 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:27.977308 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"]
Apr 20 20:17:28.005564 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.005542 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgmn5\" (UniqueName: \"kubernetes.io/projected/73fc3f05-bea5-452a-ae16-a8af8fd93518-kube-api-access-zgmn5\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9\" (UID: \"73fc3f05-bea5-452a-ae16-a8af8fd93518\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"
Apr 20 20:17:28.005652 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.005579 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73fc3f05-bea5-452a-ae16-a8af8fd93518-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9\" (UID: \"73fc3f05-bea5-452a-ae16-a8af8fd93518\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"
Apr 20 20:17:28.005692 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.005651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73fc3f05-bea5-452a-ae16-a8af8fd93518-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9\" (UID: \"73fc3f05-bea5-452a-ae16-a8af8fd93518\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"
Apr 20 20:17:28.106275 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.106252 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73fc3f05-bea5-452a-ae16-a8af8fd93518-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9\" (UID: \"73fc3f05-bea5-452a-ae16-a8af8fd93518\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"
Apr 20 20:17:28.106364 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.106307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgmn5\" (UniqueName: \"kubernetes.io/projected/73fc3f05-bea5-452a-ae16-a8af8fd93518-kube-api-access-zgmn5\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9\" (UID: \"73fc3f05-bea5-452a-ae16-a8af8fd93518\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"
Apr 20 20:17:28.106364 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.106347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73fc3f05-bea5-452a-ae16-a8af8fd93518-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9\" (UID: \"73fc3f05-bea5-452a-ae16-a8af8fd93518\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"
Apr 20 20:17:28.106695 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.106676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73fc3f05-bea5-452a-ae16-a8af8fd93518-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9\" (UID: \"73fc3f05-bea5-452a-ae16-a8af8fd93518\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"
Apr 20 20:17:28.106779 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.106674 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73fc3f05-bea5-452a-ae16-a8af8fd93518-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9\" (UID: \"73fc3f05-bea5-452a-ae16-a8af8fd93518\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"
Apr 20 20:17:28.114579 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.114553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgmn5\" (UniqueName: \"kubernetes.io/projected/73fc3f05-bea5-452a-ae16-a8af8fd93518-kube-api-access-zgmn5\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9\" (UID: \"73fc3f05-bea5-452a-ae16-a8af8fd93518\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"
Apr 20 20:17:28.278361 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.278297 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"
Apr 20 20:17:28.398456 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.398348 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"]
Apr 20 20:17:28.400499 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:17:28.400450 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73fc3f05_bea5_452a_ae16_a8af8fd93518.slice/crio-08f3acbd21966581bbea6a3725d6c95e550ec4cdbd04cfc68aa49060ed6fd1ea WatchSource:0}: Error finding container 08f3acbd21966581bbea6a3725d6c95e550ec4cdbd04cfc68aa49060ed6fd1ea: Status 404 returned error can't find the container with id 08f3acbd21966581bbea6a3725d6c95e550ec4cdbd04cfc68aa49060ed6fd1ea
Apr 20 20:17:28.469450 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.469423 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h"]
Apr 20 20:17:28.471761 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.471715 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h"
Apr 20 20:17:28.473992 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.473972 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 20 20:17:28.474100 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.473994 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-dxx7q\""
Apr 20 20:17:28.474100 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.474012 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 20 20:17:28.487568 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.487539 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h"]
Apr 20 20:17:28.509054 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.509030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c0b4504c-7479-45ed-936a-ac633867be44-tls-certs\") pod \"kube-auth-proxy-66df9c9b9f-znl5h\" (UID: \"c0b4504c-7479-45ed-936a-ac633867be44\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h"
Apr 20 20:17:28.509152 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.509077 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0b4504c-7479-45ed-936a-ac633867be44-tmp\") pod \"kube-auth-proxy-66df9c9b9f-znl5h\" (UID: \"c0b4504c-7479-45ed-936a-ac633867be44\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h"
Apr 20 20:17:28.509225 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.509207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67jbc\" (UniqueName: \"kubernetes.io/projected/c0b4504c-7479-45ed-936a-ac633867be44-kube-api-access-67jbc\") pod \"kube-auth-proxy-66df9c9b9f-znl5h\" (UID: \"c0b4504c-7479-45ed-936a-ac633867be44\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h"
Apr 20 20:17:28.609817 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.609791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c0b4504c-7479-45ed-936a-ac633867be44-tls-certs\") pod \"kube-auth-proxy-66df9c9b9f-znl5h\" (UID: \"c0b4504c-7479-45ed-936a-ac633867be44\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h"
Apr 20 20:17:28.609927 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.609825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0b4504c-7479-45ed-936a-ac633867be44-tmp\") pod \"kube-auth-proxy-66df9c9b9f-znl5h\" (UID: \"c0b4504c-7479-45ed-936a-ac633867be44\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h"
Apr 20 20:17:28.609927 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.609875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67jbc\" (UniqueName: \"kubernetes.io/projected/c0b4504c-7479-45ed-936a-ac633867be44-kube-api-access-67jbc\") pod \"kube-auth-proxy-66df9c9b9f-znl5h\" (UID: \"c0b4504c-7479-45ed-936a-ac633867be44\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h"
Apr 20 20:17:28.612067 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.612047 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0b4504c-7479-45ed-936a-ac633867be44-tmp\") pod \"kube-auth-proxy-66df9c9b9f-znl5h\" (UID: \"c0b4504c-7479-45ed-936a-ac633867be44\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h"
Apr 20 20:17:28.612212 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.612195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c0b4504c-7479-45ed-936a-ac633867be44-tls-certs\") pod \"kube-auth-proxy-66df9c9b9f-znl5h\" (UID: \"c0b4504c-7479-45ed-936a-ac633867be44\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h"
Apr 20 20:17:28.617895 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.617878 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67jbc\" (UniqueName: \"kubernetes.io/projected/c0b4504c-7479-45ed-936a-ac633867be44-kube-api-access-67jbc\") pod \"kube-auth-proxy-66df9c9b9f-znl5h\" (UID: \"c0b4504c-7479-45ed-936a-ac633867be44\") " pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h"
Apr 20 20:17:28.781221 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.781199 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h"
Apr 20 20:17:28.898650 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:28.898624 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h"]
Apr 20 20:17:28.900291 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:17:28.900257 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0b4504c_7479_45ed_936a_ac633867be44.slice/crio-23ad50bfb715e0b88c1f51870f11ad6f402183956ac8da678b40adbad98b7599 WatchSource:0}: Error finding container 23ad50bfb715e0b88c1f51870f11ad6f402183956ac8da678b40adbad98b7599: Status 404 returned error can't find the container with id 23ad50bfb715e0b88c1f51870f11ad6f402183956ac8da678b40adbad98b7599
Apr 20 20:17:29.035949 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:29.035915 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h" event={"ID":"c0b4504c-7479-45ed-936a-ac633867be44","Type":"ContainerStarted","Data":"23ad50bfb715e0b88c1f51870f11ad6f402183956ac8da678b40adbad98b7599"}
Apr 20 20:17:29.037181 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:29.037156 2576 generic.go:358] "Generic (PLEG): container finished" podID="73fc3f05-bea5-452a-ae16-a8af8fd93518" containerID="de1fa4e5eb388156371b310b625644548d136256055e902ebddc25f25caa8fe0" exitCode=0
Apr 20 20:17:29.037264 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:29.037241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9" event={"ID":"73fc3f05-bea5-452a-ae16-a8af8fd93518","Type":"ContainerDied","Data":"de1fa4e5eb388156371b310b625644548d136256055e902ebddc25f25caa8fe0"}
Apr 20 20:17:29.037308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:29.037272 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9" event={"ID":"73fc3f05-bea5-452a-ae16-a8af8fd93518","Type":"ContainerStarted","Data":"08f3acbd21966581bbea6a3725d6c95e550ec4cdbd04cfc68aa49060ed6fd1ea"}
Apr 20 20:17:30.043763 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:30.043705 2576 generic.go:358] "Generic (PLEG): container finished" podID="73fc3f05-bea5-452a-ae16-a8af8fd93518" containerID="b675a7f11ea4e1cabb7a40ebdfb2c2d903869264268bbc296bfa4383cfcd59f0" exitCode=0
Apr 20 20:17:30.044141 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:30.043779 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9" event={"ID":"73fc3f05-bea5-452a-ae16-a8af8fd93518","Type":"ContainerDied","Data":"b675a7f11ea4e1cabb7a40ebdfb2c2d903869264268bbc296bfa4383cfcd59f0"}
Apr 20 20:17:31.053469 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:31.053437 2576 generic.go:358] "Generic (PLEG): container finished" podID="73fc3f05-bea5-452a-ae16-a8af8fd93518" containerID="5daebabf9d09341665e6e29cb700e5aecde049d1bb174b0964b1c4eb2f3b4464" exitCode=0
Apr 20 20:17:31.053880 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:31.053507 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9" event={"ID":"73fc3f05-bea5-452a-ae16-a8af8fd93518","Type":"ContainerDied","Data":"5daebabf9d09341665e6e29cb700e5aecde049d1bb174b0964b1c4eb2f3b4464"}
Apr 20 20:17:31.535389 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:31.535358 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-56p4f"]
Apr 20 20:17:31.537557 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:31.537538 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-56p4f"
Apr 20 20:17:31.539991 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:31.539962 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 20 20:17:31.540128 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:31.540080 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-hp5k2\""
Apr 20 20:17:31.546399 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:31.546373 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-56p4f"]
Apr 20 20:17:31.635000 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:31.634957 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7370a528-f1a2-4465-bc6f-5d7195d5a29f-cert\") pod \"odh-model-controller-858dbf95b8-56p4f\" (UID: \"7370a528-f1a2-4465-bc6f-5d7195d5a29f\") " pod="opendatahub/odh-model-controller-858dbf95b8-56p4f"
Apr 20 20:17:31.635153 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:31.635034 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnbm5\" (UniqueName: \"kubernetes.io/projected/7370a528-f1a2-4465-bc6f-5d7195d5a29f-kube-api-access-wnbm5\") pod \"odh-model-controller-858dbf95b8-56p4f\" (UID: \"7370a528-f1a2-4465-bc6f-5d7195d5a29f\") " pod="opendatahub/odh-model-controller-858dbf95b8-56p4f"
Apr 20 20:17:31.735864 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:31.735833 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7370a528-f1a2-4465-bc6f-5d7195d5a29f-cert\") pod \"odh-model-controller-858dbf95b8-56p4f\" (UID: \"7370a528-f1a2-4465-bc6f-5d7195d5a29f\") " pod="opendatahub/odh-model-controller-858dbf95b8-56p4f"
Apr 20 20:17:31.735974 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:31.735897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wnbm5\" (UniqueName: \"kubernetes.io/projected/7370a528-f1a2-4465-bc6f-5d7195d5a29f-kube-api-access-wnbm5\") pod \"odh-model-controller-858dbf95b8-56p4f\" (UID: \"7370a528-f1a2-4465-bc6f-5d7195d5a29f\") " pod="opendatahub/odh-model-controller-858dbf95b8-56p4f"
Apr 20 20:17:31.736041 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:17:31.735975 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 20 20:17:31.736041 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:17:31.736033 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7370a528-f1a2-4465-bc6f-5d7195d5a29f-cert podName:7370a528-f1a2-4465-bc6f-5d7195d5a29f nodeName:}" failed. No retries permitted until 2026-04-20 20:17:32.236017081 +0000 UTC m=+368.105795690 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7370a528-f1a2-4465-bc6f-5d7195d5a29f-cert") pod "odh-model-controller-858dbf95b8-56p4f" (UID: "7370a528-f1a2-4465-bc6f-5d7195d5a29f") : secret "odh-model-controller-webhook-cert" not found
Apr 20 20:17:31.745316 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:31.745287 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnbm5\" (UniqueName: \"kubernetes.io/projected/7370a528-f1a2-4465-bc6f-5d7195d5a29f-kube-api-access-wnbm5\") pod \"odh-model-controller-858dbf95b8-56p4f\" (UID: \"7370a528-f1a2-4465-bc6f-5d7195d5a29f\") " pod="opendatahub/odh-model-controller-858dbf95b8-56p4f"
Apr 20 20:17:32.240145 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:32.240114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7370a528-f1a2-4465-bc6f-5d7195d5a29f-cert\") pod \"odh-model-controller-858dbf95b8-56p4f\" (UID: \"7370a528-f1a2-4465-bc6f-5d7195d5a29f\") " pod="opendatahub/odh-model-controller-858dbf95b8-56p4f"
Apr 20 20:17:32.242950 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:32.242922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7370a528-f1a2-4465-bc6f-5d7195d5a29f-cert\") pod \"odh-model-controller-858dbf95b8-56p4f\" (UID: \"7370a528-f1a2-4465-bc6f-5d7195d5a29f\") " pod="opendatahub/odh-model-controller-858dbf95b8-56p4f"
Apr 20 20:17:32.449066 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:32.448985 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-56p4f"
Apr 20 20:17:32.666396 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:32.666371 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"
Apr 20 20:17:32.745777 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:32.745752 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73fc3f05-bea5-452a-ae16-a8af8fd93518-bundle\") pod \"73fc3f05-bea5-452a-ae16-a8af8fd93518\" (UID: \"73fc3f05-bea5-452a-ae16-a8af8fd93518\") "
Apr 20 20:17:32.745897 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:32.745815 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73fc3f05-bea5-452a-ae16-a8af8fd93518-util\") pod \"73fc3f05-bea5-452a-ae16-a8af8fd93518\" (UID: \"73fc3f05-bea5-452a-ae16-a8af8fd93518\") "
Apr 20 20:17:32.745897 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:32.745856 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgmn5\" (UniqueName: \"kubernetes.io/projected/73fc3f05-bea5-452a-ae16-a8af8fd93518-kube-api-access-zgmn5\") pod \"73fc3f05-bea5-452a-ae16-a8af8fd93518\" (UID: \"73fc3f05-bea5-452a-ae16-a8af8fd93518\") "
Apr 20 20:17:32.746812 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:32.746724 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73fc3f05-bea5-452a-ae16-a8af8fd93518-bundle" (OuterVolumeSpecName: "bundle") pod "73fc3f05-bea5-452a-ae16-a8af8fd93518" (UID: "73fc3f05-bea5-452a-ae16-a8af8fd93518"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:17:32.748606 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:32.748555 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73fc3f05-bea5-452a-ae16-a8af8fd93518-kube-api-access-zgmn5" (OuterVolumeSpecName: "kube-api-access-zgmn5") pod "73fc3f05-bea5-452a-ae16-a8af8fd93518" (UID: "73fc3f05-bea5-452a-ae16-a8af8fd93518"). InnerVolumeSpecName "kube-api-access-zgmn5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:17:32.755507 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:32.755430 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73fc3f05-bea5-452a-ae16-a8af8fd93518-util" (OuterVolumeSpecName: "util") pod "73fc3f05-bea5-452a-ae16-a8af8fd93518" (UID: "73fc3f05-bea5-452a-ae16-a8af8fd93518"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:17:32.775426 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:32.775404 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-56p4f"]
Apr 20 20:17:32.777861 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:17:32.777837 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7370a528_f1a2_4465_bc6f_5d7195d5a29f.slice/crio-a391516942c0a2d347649bfd8b7d2046a1cce184c273f6745ccdc33cfd99170b WatchSource:0}: Error finding container a391516942c0a2d347649bfd8b7d2046a1cce184c273f6745ccdc33cfd99170b: Status 404 returned error can't find the container with id a391516942c0a2d347649bfd8b7d2046a1cce184c273f6745ccdc33cfd99170b
Apr 20 20:17:32.846355 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:32.846330 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73fc3f05-bea5-452a-ae16-a8af8fd93518-bundle\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:17:32.846355 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:32.846356 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73fc3f05-bea5-452a-ae16-a8af8fd93518-util\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:17:32.846469 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:32.846366 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zgmn5\" (UniqueName: \"kubernetes.io/projected/73fc3f05-bea5-452a-ae16-a8af8fd93518-kube-api-access-zgmn5\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:17:33.061746 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:33.061713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h" event={"ID":"c0b4504c-7479-45ed-936a-ac633867be44","Type":"ContainerStarted","Data":"d75655d1f5d9cf77c53b1bbbac29f8ba9dcfa1c224a483dd371865c0672e18d7"}
Apr 20 20:17:33.063492 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:33.063473 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9"
Apr 20 20:17:33.063595 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:33.063501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hhcj9" event={"ID":"73fc3f05-bea5-452a-ae16-a8af8fd93518","Type":"ContainerDied","Data":"08f3acbd21966581bbea6a3725d6c95e550ec4cdbd04cfc68aa49060ed6fd1ea"}
Apr 20 20:17:33.063595 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:33.063534 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08f3acbd21966581bbea6a3725d6c95e550ec4cdbd04cfc68aa49060ed6fd1ea"
Apr 20 20:17:33.064658 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:33.064633 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-56p4f" event={"ID":"7370a528-f1a2-4465-bc6f-5d7195d5a29f","Type":"ContainerStarted","Data":"a391516942c0a2d347649bfd8b7d2046a1cce184c273f6745ccdc33cfd99170b"}
Apr 20 20:17:33.077112 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:33.077066 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-66df9c9b9f-znl5h" podStartSLOduration=1.296535101 podStartE2EDuration="5.077051681s" podCreationTimestamp="2026-04-20 20:17:28 +0000 UTC" firstStartedPulling="2026-04-20 20:17:28.902121446 +0000 UTC m=+364.771900054" lastFinishedPulling="2026-04-20 20:17:32.68263802 +0000 UTC m=+368.552416634" observedRunningTime="2026-04-20 20:17:33.076790769 +0000 UTC m=+368.946569401" watchObservedRunningTime="2026-04-20 20:17:33.077051681 +0000 UTC m=+368.946830313"
Apr 20 20:17:36.076704 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.076670 2576 generic.go:358] "Generic (PLEG): container finished" podID="7370a528-f1a2-4465-bc6f-5d7195d5a29f" containerID="8f7dba0b8cb1f92aeb9a8ae4b54a3f944d68af10d8952de70c5fe3d50e8f9d9c" exitCode=1
Apr 20 20:17:36.077071 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.076714 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-56p4f" event={"ID":"7370a528-f1a2-4465-bc6f-5d7195d5a29f","Type":"ContainerDied","Data":"8f7dba0b8cb1f92aeb9a8ae4b54a3f944d68af10d8952de70c5fe3d50e8f9d9c"}
Apr 20 20:17:36.077071 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.076920 2576 scope.go:117] "RemoveContainer" containerID="8f7dba0b8cb1f92aeb9a8ae4b54a3f944d68af10d8952de70c5fe3d50e8f9d9c"
Apr 20 20:17:36.533022 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.532992 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-bqld6"]
Apr 20 20:17:36.533343 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.533330 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73fc3f05-bea5-452a-ae16-a8af8fd93518" containerName="util"
Apr 20 20:17:36.533394 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.533345 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="73fc3f05-bea5-452a-ae16-a8af8fd93518" containerName="util"
Apr 20 20:17:36.533394 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.533353 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73fc3f05-bea5-452a-ae16-a8af8fd93518" containerName="extract"
Apr 20 20:17:36.533394 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.533359 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="73fc3f05-bea5-452a-ae16-a8af8fd93518" containerName="extract"
Apr 20 20:17:36.533394 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.533370 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73fc3f05-bea5-452a-ae16-a8af8fd93518" containerName="pull"
Apr 20 20:17:36.533394 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.533375 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="73fc3f05-bea5-452a-ae16-a8af8fd93518" containerName="pull"
Apr 20 20:17:36.533539 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.533423 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="73fc3f05-bea5-452a-ae16-a8af8fd93518" containerName="extract"
Apr 20 20:17:36.535352 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.535335 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-bqld6"
Apr 20 20:17:36.537888 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.537862 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 20 20:17:36.537991 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.537894 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-gcfr6\""
Apr 20 20:17:36.544367 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.544304 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-bqld6"]
Apr 20 20:17:36.577965 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.577932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adeb5434-be14-4496-94c9-a8d5191d38a0-cert\") pod \"kserve-controller-manager-856948b99f-bqld6\" (UID: \"adeb5434-be14-4496-94c9-a8d5191d38a0\") " pod="opendatahub/kserve-controller-manager-856948b99f-bqld6"
Apr 20 20:17:36.578072 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.577968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbvdr\" (UniqueName: \"kubernetes.io/projected/adeb5434-be14-4496-94c9-a8d5191d38a0-kube-api-access-jbvdr\") pod \"kserve-controller-manager-856948b99f-bqld6\" (UID: \"adeb5434-be14-4496-94c9-a8d5191d38a0\") " pod="opendatahub/kserve-controller-manager-856948b99f-bqld6"
Apr 20 20:17:36.678995 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.678886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adeb5434-be14-4496-94c9-a8d5191d38a0-cert\") pod \"kserve-controller-manager-856948b99f-bqld6\" (UID: \"adeb5434-be14-4496-94c9-a8d5191d38a0\") " pod="opendatahub/kserve-controller-manager-856948b99f-bqld6"
Apr 20 20:17:36.678995 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.678934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbvdr\" (UniqueName: \"kubernetes.io/projected/adeb5434-be14-4496-94c9-a8d5191d38a0-kube-api-access-jbvdr\") pod \"kserve-controller-manager-856948b99f-bqld6\" (UID: \"adeb5434-be14-4496-94c9-a8d5191d38a0\") " pod="opendatahub/kserve-controller-manager-856948b99f-bqld6"
Apr 20 20:17:36.679177 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:17:36.679032 2576 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 20 20:17:36.679177 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:17:36.679117 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adeb5434-be14-4496-94c9-a8d5191d38a0-cert podName:adeb5434-be14-4496-94c9-a8d5191d38a0 nodeName:}" failed. No retries permitted until 2026-04-20 20:17:37.179097151 +0000 UTC m=+373.048875782 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/adeb5434-be14-4496-94c9-a8d5191d38a0-cert") pod "kserve-controller-manager-856948b99f-bqld6" (UID: "adeb5434-be14-4496-94c9-a8d5191d38a0") : secret "kserve-webhook-server-cert" not found
Apr 20 20:17:36.693682 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:36.693658 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbvdr\" (UniqueName: \"kubernetes.io/projected/adeb5434-be14-4496-94c9-a8d5191d38a0-kube-api-access-jbvdr\") pod \"kserve-controller-manager-856948b99f-bqld6\" (UID: \"adeb5434-be14-4496-94c9-a8d5191d38a0\") " pod="opendatahub/kserve-controller-manager-856948b99f-bqld6"
Apr 20 20:17:37.081883 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:37.081851 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-56p4f" event={"ID":"7370a528-f1a2-4465-bc6f-5d7195d5a29f","Type":"ContainerStarted","Data":"e9ede4a9cac26c343a93e8d9eb2297badcac60d368d2077f914d9b3320e27fcf"}
Apr 20 20:17:37.082326 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:37.082010 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-56p4f"
Apr 20 20:17:37.099941 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:37.099900 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-56p4f" podStartSLOduration=2.569501518 podStartE2EDuration="6.099888672s" podCreationTimestamp="2026-04-20 20:17:31 +0000 UTC" firstStartedPulling="2026-04-20 20:17:32.779552708 +0000 UTC m=+368.649331318" lastFinishedPulling="2026-04-20 20:17:36.309939862 +0000 UTC m=+372.179718472" observedRunningTime="2026-04-20 20:17:37.098048894 +0000 UTC m=+372.967827552" watchObservedRunningTime="2026-04-20 20:17:37.099888672 +0000 UTC m=+372.969667302"
Apr 20 20:17:37.182949 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:37.182919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adeb5434-be14-4496-94c9-a8d5191d38a0-cert\") pod \"kserve-controller-manager-856948b99f-bqld6\" (UID: \"adeb5434-be14-4496-94c9-a8d5191d38a0\") " pod="opendatahub/kserve-controller-manager-856948b99f-bqld6"
Apr 20 20:17:37.185300 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:37.185274 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adeb5434-be14-4496-94c9-a8d5191d38a0-cert\") pod \"kserve-controller-manager-856948b99f-bqld6\" (UID: \"adeb5434-be14-4496-94c9-a8d5191d38a0\") " pod="opendatahub/kserve-controller-manager-856948b99f-bqld6"
Apr 20 20:17:37.448962 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:37.448879 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-bqld6"
Apr 20 20:17:37.784432 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:37.784401 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-bqld6"]
Apr 20 20:17:37.787694 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:17:37.787665 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadeb5434_be14_4496_94c9_a8d5191d38a0.slice/crio-e781c7f66aa0dd47faed0ef2c2e99b763b24fd26127ba9a0a1ab3405ab1a3002 WatchSource:0}: Error finding container e781c7f66aa0dd47faed0ef2c2e99b763b24fd26127ba9a0a1ab3405ab1a3002: Status 404 returned error can't find the container with id e781c7f66aa0dd47faed0ef2c2e99b763b24fd26127ba9a0a1ab3405ab1a3002
Apr 20 20:17:38.086382 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:38.086311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-bqld6" event={"ID":"adeb5434-be14-4496-94c9-a8d5191d38a0","Type":"ContainerStarted","Data":"e781c7f66aa0dd47faed0ef2c2e99b763b24fd26127ba9a0a1ab3405ab1a3002"}
Apr 20 20:17:41.098375 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:41.098344 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-bqld6" event={"ID":"adeb5434-be14-4496-94c9-a8d5191d38a0","Type":"ContainerStarted","Data":"e0bc803e27a209c6a8e3e6bc88ce0ffde1ec6201abf6cd7ada8a2305fdf61b6c"}
Apr 20 20:17:41.098840 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:41.098422 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-bqld6"
Apr 20 20:17:41.125970 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:41.125927 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-bqld6" podStartSLOduration=2.775995976 podStartE2EDuration="5.125912891s" podCreationTimestamp="2026-04-20 20:17:36 +0000 UTC" firstStartedPulling="2026-04-20 20:17:37.788919842 +0000 UTC m=+373.658698451" lastFinishedPulling="2026-04-20 20:17:40.138836743 +0000 UTC m=+376.008615366" observedRunningTime="2026-04-20 20:17:41.123682197 +0000 UTC m=+376.993460827" watchObservedRunningTime="2026-04-20 20:17:41.125912891 +0000 UTC m=+376.995691521"
Apr 20 20:17:42.092341 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.092310 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj"]
Apr 20 20:17:42.094955 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.094933 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" Apr 20 20:17:42.097835 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.097809 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 20:17:42.098989 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.098932 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 20:17:42.098989 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.098936 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x9jb9\"" Apr 20 20:17:42.104454 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.104411 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj"] Apr 20 20:17:42.226216 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.226184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k27gz\" (UniqueName: \"kubernetes.io/projected/f1d2ed43-1803-4a93-baae-f267d09318a8-kube-api-access-k27gz\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj\" (UID: \"f1d2ed43-1803-4a93-baae-f267d09318a8\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" Apr 20 20:17:42.226385 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.226228 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1d2ed43-1803-4a93-baae-f267d09318a8-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj\" (UID: \"f1d2ed43-1803-4a93-baae-f267d09318a8\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" Apr 20 20:17:42.226450 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.226405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1d2ed43-1803-4a93-baae-f267d09318a8-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj\" (UID: \"f1d2ed43-1803-4a93-baae-f267d09318a8\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" Apr 20 20:17:42.327629 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.327595 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1d2ed43-1803-4a93-baae-f267d09318a8-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj\" (UID: \"f1d2ed43-1803-4a93-baae-f267d09318a8\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" Apr 20 20:17:42.327796 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.327642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k27gz\" (UniqueName: \"kubernetes.io/projected/f1d2ed43-1803-4a93-baae-f267d09318a8-kube-api-access-k27gz\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj\" (UID: \"f1d2ed43-1803-4a93-baae-f267d09318a8\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" Apr 20 20:17:42.327796 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.327677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1d2ed43-1803-4a93-baae-f267d09318a8-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj\" (UID: \"f1d2ed43-1803-4a93-baae-f267d09318a8\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" Apr 20 20:17:42.328046 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.328025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1d2ed43-1803-4a93-baae-f267d09318a8-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj\" (UID: \"f1d2ed43-1803-4a93-baae-f267d09318a8\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" Apr 20 20:17:42.328091 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.328049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1d2ed43-1803-4a93-baae-f267d09318a8-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj\" (UID: \"f1d2ed43-1803-4a93-baae-f267d09318a8\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" Apr 20 20:17:42.336162 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.336140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k27gz\" (UniqueName: \"kubernetes.io/projected/f1d2ed43-1803-4a93-baae-f267d09318a8-kube-api-access-k27gz\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj\" (UID: \"f1d2ed43-1803-4a93-baae-f267d09318a8\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" Apr 20 20:17:42.406716 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.406666 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" Apr 20 20:17:42.540560 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:42.540536 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj"] Apr 20 20:17:42.542695 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:17:42.542667 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d2ed43_1803_4a93_baae_f267d09318a8.slice/crio-04f8ec31eb0f943ccc0f1c7caa6b17fd85ef35d9ef57c420875b1db61140f900 WatchSource:0}: Error finding container 04f8ec31eb0f943ccc0f1c7caa6b17fd85ef35d9ef57c420875b1db61140f900: Status 404 returned error can't find the container with id 04f8ec31eb0f943ccc0f1c7caa6b17fd85ef35d9ef57c420875b1db61140f900 Apr 20 20:17:43.107280 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.107243 2576 generic.go:358] "Generic (PLEG): container finished" podID="f1d2ed43-1803-4a93-baae-f267d09318a8" containerID="2ee480e70afaf527a1b7af1ce61434db36d578224ce064ffa68cd3edcf81fb02" exitCode=0 Apr 20 20:17:43.107723 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.107334 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" event={"ID":"f1d2ed43-1803-4a93-baae-f267d09318a8","Type":"ContainerDied","Data":"2ee480e70afaf527a1b7af1ce61434db36d578224ce064ffa68cd3edcf81fb02"} Apr 20 20:17:43.107723 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.107373 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" event={"ID":"f1d2ed43-1803-4a93-baae-f267d09318a8","Type":"ContainerStarted","Data":"04f8ec31eb0f943ccc0f1c7caa6b17fd85ef35d9ef57c420875b1db61140f900"} Apr 20 20:17:43.335585 ip-10-0-129-247 kubenswrapper[2576]: 
I0420 20:17:43.335554 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5"] Apr 20 20:17:43.338414 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.338399 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5" Apr 20 20:17:43.341806 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.341778 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 20 20:17:43.341928 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.341813 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-hswm2\"" Apr 20 20:17:43.341928 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.341820 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 20 20:17:43.353360 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.353339 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5"] Apr 20 20:17:43.436346 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.436283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8pnl\" (UniqueName: \"kubernetes.io/projected/a5135ea4-fc46-416e-9367-e2e23fc794a6-kube-api-access-v8pnl\") pod \"servicemesh-operator3-55f49c5f94-cz2r5\" (UID: \"a5135ea4-fc46-416e-9367-e2e23fc794a6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5" Apr 20 20:17:43.436451 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.436357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a5135ea4-fc46-416e-9367-e2e23fc794a6-operator-config\") pod 
\"servicemesh-operator3-55f49c5f94-cz2r5\" (UID: \"a5135ea4-fc46-416e-9367-e2e23fc794a6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5" Apr 20 20:17:43.537560 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.537533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a5135ea4-fc46-416e-9367-e2e23fc794a6-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cz2r5\" (UID: \"a5135ea4-fc46-416e-9367-e2e23fc794a6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5" Apr 20 20:17:43.537678 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.537599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8pnl\" (UniqueName: \"kubernetes.io/projected/a5135ea4-fc46-416e-9367-e2e23fc794a6-kube-api-access-v8pnl\") pod \"servicemesh-operator3-55f49c5f94-cz2r5\" (UID: \"a5135ea4-fc46-416e-9367-e2e23fc794a6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5" Apr 20 20:17:43.540093 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.540066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a5135ea4-fc46-416e-9367-e2e23fc794a6-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cz2r5\" (UID: \"a5135ea4-fc46-416e-9367-e2e23fc794a6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5" Apr 20 20:17:43.546989 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.546967 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8pnl\" (UniqueName: \"kubernetes.io/projected/a5135ea4-fc46-416e-9367-e2e23fc794a6-kube-api-access-v8pnl\") pod \"servicemesh-operator3-55f49c5f94-cz2r5\" (UID: \"a5135ea4-fc46-416e-9367-e2e23fc794a6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5" Apr 20 20:17:43.651292 ip-10-0-129-247 kubenswrapper[2576]: I0420 
20:17:43.651272 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5" Apr 20 20:17:43.779020 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:43.778998 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5"] Apr 20 20:17:43.780919 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:17:43.780890 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5135ea4_fc46_416e_9367_e2e23fc794a6.slice/crio-3c28ddf6e3885a9e751fdbe96fb4df3b26e0e4f812358bec9f5d9c841633e955 WatchSource:0}: Error finding container 3c28ddf6e3885a9e751fdbe96fb4df3b26e0e4f812358bec9f5d9c841633e955: Status 404 returned error can't find the container with id 3c28ddf6e3885a9e751fdbe96fb4df3b26e0e4f812358bec9f5d9c841633e955 Apr 20 20:17:44.112272 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:44.112241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5" event={"ID":"a5135ea4-fc46-416e-9367-e2e23fc794a6","Type":"ContainerStarted","Data":"3c28ddf6e3885a9e751fdbe96fb4df3b26e0e4f812358bec9f5d9c841633e955"} Apr 20 20:17:47.126778 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:47.126719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5" event={"ID":"a5135ea4-fc46-416e-9367-e2e23fc794a6","Type":"ContainerStarted","Data":"ed2c4feff2d6f24708f88987d52cb2cedc2d585e6ddc9742030312a7361c0fb3"} Apr 20 20:17:47.127190 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:47.126791 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5" Apr 20 20:17:47.128412 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:47.128384 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="f1d2ed43-1803-4a93-baae-f267d09318a8" containerID="bba9d065c29cede08be4661ff71b8d8b5b230a0e8ef45ceb8b8e5fd47c3a5aca" exitCode=0 Apr 20 20:17:47.128527 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:47.128419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" event={"ID":"f1d2ed43-1803-4a93-baae-f267d09318a8","Type":"ContainerDied","Data":"bba9d065c29cede08be4661ff71b8d8b5b230a0e8ef45ceb8b8e5fd47c3a5aca"} Apr 20 20:17:47.149721 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:47.149679 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5" podStartSLOduration=1.783045972 podStartE2EDuration="4.149662537s" podCreationTimestamp="2026-04-20 20:17:43 +0000 UTC" firstStartedPulling="2026-04-20 20:17:43.783349169 +0000 UTC m=+379.653127778" lastFinishedPulling="2026-04-20 20:17:46.149965732 +0000 UTC m=+382.019744343" observedRunningTime="2026-04-20 20:17:47.147927449 +0000 UTC m=+383.017706081" watchObservedRunningTime="2026-04-20 20:17:47.149662537 +0000 UTC m=+383.019441181" Apr 20 20:17:48.088702 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:48.088675 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-56p4f" Apr 20 20:17:48.134347 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:48.134315 2576 generic.go:358] "Generic (PLEG): container finished" podID="f1d2ed43-1803-4a93-baae-f267d09318a8" containerID="f26b6b97aac9f40ad897394e2977a23d27152cd98e019889bc0b4c1c219b2e06" exitCode=0 Apr 20 20:17:48.134778 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:48.134437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" 
event={"ID":"f1d2ed43-1803-4a93-baae-f267d09318a8","Type":"ContainerDied","Data":"f26b6b97aac9f40ad897394e2977a23d27152cd98e019889bc0b4c1c219b2e06"} Apr 20 20:17:49.264165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:49.264144 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" Apr 20 20:17:49.383651 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:49.383623 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k27gz\" (UniqueName: \"kubernetes.io/projected/f1d2ed43-1803-4a93-baae-f267d09318a8-kube-api-access-k27gz\") pod \"f1d2ed43-1803-4a93-baae-f267d09318a8\" (UID: \"f1d2ed43-1803-4a93-baae-f267d09318a8\") " Apr 20 20:17:49.383814 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:49.383663 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1d2ed43-1803-4a93-baae-f267d09318a8-bundle\") pod \"f1d2ed43-1803-4a93-baae-f267d09318a8\" (UID: \"f1d2ed43-1803-4a93-baae-f267d09318a8\") " Apr 20 20:17:49.383814 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:49.383717 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1d2ed43-1803-4a93-baae-f267d09318a8-util\") pod \"f1d2ed43-1803-4a93-baae-f267d09318a8\" (UID: \"f1d2ed43-1803-4a93-baae-f267d09318a8\") " Apr 20 20:17:49.384586 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:49.384556 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1d2ed43-1803-4a93-baae-f267d09318a8-bundle" (OuterVolumeSpecName: "bundle") pod "f1d2ed43-1803-4a93-baae-f267d09318a8" (UID: "f1d2ed43-1803-4a93-baae-f267d09318a8"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:17:49.385931 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:49.385901 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1d2ed43-1803-4a93-baae-f267d09318a8-kube-api-access-k27gz" (OuterVolumeSpecName: "kube-api-access-k27gz") pod "f1d2ed43-1803-4a93-baae-f267d09318a8" (UID: "f1d2ed43-1803-4a93-baae-f267d09318a8"). InnerVolumeSpecName "kube-api-access-k27gz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:17:49.389877 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:49.389826 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1d2ed43-1803-4a93-baae-f267d09318a8-util" (OuterVolumeSpecName: "util") pod "f1d2ed43-1803-4a93-baae-f267d09318a8" (UID: "f1d2ed43-1803-4a93-baae-f267d09318a8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:17:49.484494 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:49.484466 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1d2ed43-1803-4a93-baae-f267d09318a8-util\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:17:49.484494 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:49.484491 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k27gz\" (UniqueName: \"kubernetes.io/projected/f1d2ed43-1803-4a93-baae-f267d09318a8-kube-api-access-k27gz\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:17:49.484618 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:49.484509 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1d2ed43-1803-4a93-baae-f267d09318a8-bundle\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:17:50.143906 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:50.143881 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" Apr 20 20:17:50.144039 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:50.143902 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lm9sj" event={"ID":"f1d2ed43-1803-4a93-baae-f267d09318a8","Type":"ContainerDied","Data":"04f8ec31eb0f943ccc0f1c7caa6b17fd85ef35d9ef57c420875b1db61140f900"} Apr 20 20:17:50.144039 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:50.143930 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04f8ec31eb0f943ccc0f1c7caa6b17fd85ef35d9ef57c420875b1db61140f900" Apr 20 20:17:53.527952 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.527922 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"] Apr 20 20:17:53.528316 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.528243 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1d2ed43-1803-4a93-baae-f267d09318a8" containerName="pull" Apr 20 20:17:53.528316 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.528253 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d2ed43-1803-4a93-baae-f267d09318a8" containerName="pull" Apr 20 20:17:53.528316 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.528270 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1d2ed43-1803-4a93-baae-f267d09318a8" containerName="util" Apr 20 20:17:53.528316 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.528275 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d2ed43-1803-4a93-baae-f267d09318a8" containerName="util" Apr 20 20:17:53.528316 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.528285 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f1d2ed43-1803-4a93-baae-f267d09318a8" containerName="extract" Apr 20 20:17:53.528316 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.528290 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d2ed43-1803-4a93-baae-f267d09318a8" containerName="extract" Apr 20 20:17:53.528499 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.528335 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1d2ed43-1803-4a93-baae-f267d09318a8" containerName="extract" Apr 20 20:17:53.531845 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.531821 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb" Apr 20 20:17:53.534340 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.534318 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-lkdgk\"" Apr 20 20:17:53.534473 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.534391 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 20 20:17:53.534473 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.534441 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 20 20:17:53.534659 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.534641 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 20 20:17:53.534731 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.534716 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 20:17:53.541971 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.541950 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"] Apr 20 20:17:53.615643 ip-10-0-129-247 
kubenswrapper[2576]: I0420 20:17:53.615616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4d268fd0-1f66-44c4-a826-361d4dc917c0-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb" Apr 20 20:17:53.615784 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.615651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4d268fd0-1f66-44c4-a826-361d4dc917c0-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb" Apr 20 20:17:53.615784 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.615720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4d268fd0-1f66-44c4-a826-361d4dc917c0-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb" Apr 20 20:17:53.615908 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.615808 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4d268fd0-1f66-44c4-a826-361d4dc917c0-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb" Apr 20 20:17:53.615908 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.615836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-8q9kx\" (UniqueName: \"kubernetes.io/projected/4d268fd0-1f66-44c4-a826-361d4dc917c0-kube-api-access-8q9kx\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.615908 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.615886 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4d268fd0-1f66-44c4-a826-361d4dc917c0-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.615996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.615918 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/4d268fd0-1f66-44c4-a826-361d4dc917c0-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.716959 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.716931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4d268fd0-1f66-44c4-a826-361d4dc917c0-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.717075 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.716984 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q9kx\" (UniqueName: \"kubernetes.io/projected/4d268fd0-1f66-44c4-a826-361d4dc917c0-kube-api-access-8q9kx\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.717075 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.717024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4d268fd0-1f66-44c4-a826-361d4dc917c0-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.717075 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.717055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/4d268fd0-1f66-44c4-a826-361d4dc917c0-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.717221 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.717090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4d268fd0-1f66-44c4-a826-361d4dc917c0-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.717278 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.717217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4d268fd0-1f66-44c4-a826-361d4dc917c0-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.717344 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.717298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4d268fd0-1f66-44c4-a826-361d4dc917c0-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.717793 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.717770 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4d268fd0-1f66-44c4-a826-361d4dc917c0-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.719522 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.719491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4d268fd0-1f66-44c4-a826-361d4dc917c0-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.719636 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.719615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4d268fd0-1f66-44c4-a826-361d4dc917c0-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.719967 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.719949 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4d268fd0-1f66-44c4-a826-361d4dc917c0-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.720029 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.719997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/4d268fd0-1f66-44c4-a826-361d4dc917c0-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.726252 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.726230 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4d268fd0-1f66-44c4-a826-361d4dc917c0-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.726493 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.726473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q9kx\" (UniqueName: \"kubernetes.io/projected/4d268fd0-1f66-44c4-a826-361d4dc917c0-kube-api-access-8q9kx\") pod \"istiod-openshift-gateway-55ff986f96-2mmxb\" (UID: \"4d268fd0-1f66-44c4-a826-361d4dc917c0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.843037 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.843004 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:53.976762 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:53.976716 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"]
Apr 20 20:17:53.978297 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:17:53.978267 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d268fd0_1f66_44c4_a826_361d4dc917c0.slice/crio-81bbebdd1b7743a0e3c8ddd78d31394c83111c4e7093afc9171bce5c659b16ff WatchSource:0}: Error finding container 81bbebdd1b7743a0e3c8ddd78d31394c83111c4e7093afc9171bce5c659b16ff: Status 404 returned error can't find the container with id 81bbebdd1b7743a0e3c8ddd78d31394c83111c4e7093afc9171bce5c659b16ff
Apr 20 20:17:54.159255 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:54.159170 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb" event={"ID":"4d268fd0-1f66-44c4-a826-361d4dc917c0","Type":"ContainerStarted","Data":"81bbebdd1b7743a0e3c8ddd78d31394c83111c4e7093afc9171bce5c659b16ff"}
Apr 20 20:17:56.268900 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:56.268853 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 20 20:17:56.269252 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:56.268936 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 20 20:17:57.172706 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:57.172661 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb" event={"ID":"4d268fd0-1f66-44c4-a826-361d4dc917c0","Type":"ContainerStarted","Data":"7f57ec23186b046de717dab1c8a93910fbf4f40599a29f0d15a0475c9d087d47"}
Apr 20 20:17:57.172996 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:57.172973 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:17:57.174420 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:57.174379 2576 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-2mmxb container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 20 20:17:57.174524 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:57.174454 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb" podUID="4d268fd0-1f66-44c4-a826-361d4dc917c0" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:17:57.215018 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:57.214967 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb" podStartSLOduration=1.926639303 podStartE2EDuration="4.214954376s" podCreationTimestamp="2026-04-20 20:17:53 +0000 UTC" firstStartedPulling="2026-04-20 20:17:53.980253044 +0000 UTC m=+389.850031656" lastFinishedPulling="2026-04-20 20:17:56.268568107 +0000 UTC m=+392.138346729" observedRunningTime="2026-04-20 20:17:57.213557346 +0000 UTC m=+393.083335977" watchObservedRunningTime="2026-04-20 20:17:57.214954376 +0000 UTC m=+393.084733008"
Apr 20 20:17:58.137263 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:58.137232 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cz2r5"
Apr 20 20:17:58.177572 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:17:58.177540
2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2mmxb"
Apr 20 20:18:12.107855 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:12.107825 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-bqld6"
Apr 20 20:18:43.577104 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.577016 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f"]
Apr 20 20:18:43.579924 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.579901 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f"
Apr 20 20:18:43.582285 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.582264 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 20:18:43.583368 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.583347 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 20:18:43.583483 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.583355 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9hfnq\""
Apr 20 20:18:43.587642 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.587444 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f"]
Apr 20 20:18:43.694344 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.694314 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c92c179e-7812-4572-8b15-32cab54315cc-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f\" (UID: \"c92c179e-7812-4572-8b15-32cab54315cc\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f"
Apr 20 20:18:43.694466 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.694352 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c92c179e-7812-4572-8b15-32cab54315cc-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f\" (UID: \"c92c179e-7812-4572-8b15-32cab54315cc\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f"
Apr 20 20:18:43.694466 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.694381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cvfg\" (UniqueName: \"kubernetes.io/projected/c92c179e-7812-4572-8b15-32cab54315cc-kube-api-access-2cvfg\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f\" (UID: \"c92c179e-7812-4572-8b15-32cab54315cc\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f"
Apr 20 20:18:43.795199 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.795169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c92c179e-7812-4572-8b15-32cab54315cc-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f\" (UID: \"c92c179e-7812-4572-8b15-32cab54315cc\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f"
Apr 20 20:18:43.795321 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.795203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c92c179e-7812-4572-8b15-32cab54315cc-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f\" (UID: \"c92c179e-7812-4572-8b15-32cab54315cc\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f"
Apr 20 20:18:43.795321 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.795235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cvfg\" (UniqueName: \"kubernetes.io/projected/c92c179e-7812-4572-8b15-32cab54315cc-kube-api-access-2cvfg\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f\" (UID: \"c92c179e-7812-4572-8b15-32cab54315cc\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f"
Apr 20 20:18:43.795538 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.795515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c92c179e-7812-4572-8b15-32cab54315cc-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f\" (UID: \"c92c179e-7812-4572-8b15-32cab54315cc\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f"
Apr 20 20:18:43.795598 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.795545 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c92c179e-7812-4572-8b15-32cab54315cc-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f\" (UID: \"c92c179e-7812-4572-8b15-32cab54315cc\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f"
Apr 20 20:18:43.803651 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.803624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cvfg\" (UniqueName: \"kubernetes.io/projected/c92c179e-7812-4572-8b15-32cab54315cc-kube-api-access-2cvfg\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f\" (UID: \"c92c179e-7812-4572-8b15-32cab54315cc\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f"
Apr 20 20:18:43.890016 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:43.889949 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f"
Apr 20 20:18:44.012551 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.012528 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f"]
Apr 20 20:18:44.014680 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:18:44.014649 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc92c179e_7812_4572_8b15_32cab54315cc.slice/crio-f492586a7a4e54de3700e96bf06aeee70fe1418f80af2f21fb2ee0549d13b3cb WatchSource:0}: Error finding container f492586a7a4e54de3700e96bf06aeee70fe1418f80af2f21fb2ee0549d13b3cb: Status 404 returned error can't find the container with id f492586a7a4e54de3700e96bf06aeee70fe1418f80af2f21fb2ee0549d13b3cb
Apr 20 20:18:44.177556 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.177498 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k"]
Apr 20 20:18:44.179939 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.179923 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k"
Apr 20 20:18:44.187772 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.187722 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k"]
Apr 20 20:18:44.300897 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.300876 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k\" (UID: \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k"
Apr 20 20:18:44.301024 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.300911 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k\" (UID: \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k"
Apr 20 20:18:44.301024 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.300953 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwlzd\" (UniqueName: \"kubernetes.io/projected/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-kube-api-access-lwlzd\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k\" (UID: \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k"
Apr 20 20:18:44.346165 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.346136 2576 generic.go:358] "Generic (PLEG): container finished" podID="c92c179e-7812-4572-8b15-32cab54315cc" containerID="8b020ea38bbcfc76dca218137aa200dda42d3ac8562b9bc1cceec6ac4380b15f" exitCode=0
Apr 20 20:18:44.346271 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.346225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f" event={"ID":"c92c179e-7812-4572-8b15-32cab54315cc","Type":"ContainerDied","Data":"8b020ea38bbcfc76dca218137aa200dda42d3ac8562b9bc1cceec6ac4380b15f"}
Apr 20 20:18:44.346271 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.346261 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f" event={"ID":"c92c179e-7812-4572-8b15-32cab54315cc","Type":"ContainerStarted","Data":"f492586a7a4e54de3700e96bf06aeee70fe1418f80af2f21fb2ee0549d13b3cb"}
Apr 20 20:18:44.401797 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.401774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k\" (UID: \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k"
Apr 20 20:18:44.401888 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.401810 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k\" (UID: \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k"
Apr 20 20:18:44.401888 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.401849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwlzd\" (UniqueName: \"kubernetes.io/projected/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-kube-api-access-lwlzd\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k\" (UID: \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k"
Apr 20 20:18:44.402178 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.402161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k\" (UID: \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k"
Apr 20 20:18:44.402216 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.402182 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k\" (UID: \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k"
Apr 20 20:18:44.412957 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.412934 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwlzd\" (UniqueName: \"kubernetes.io/projected/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-kube-api-access-lwlzd\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k\" (UID: \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k"
Apr 20 20:18:44.489382 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.489327 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k"
Apr 20 20:18:44.574656 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.574622 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc"]
Apr 20 20:18:44.579006 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.578984 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc"
Apr 20 20:18:44.583827 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.583793 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc"]
Apr 20 20:18:44.618751 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.618720 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k"]
Apr 20 20:18:44.620146 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:18:44.620126 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb663e802_1dfb_42e4_85a6_7ccdd15d1ede.slice/crio-c5f55f1b75e03bf6d6c7799a817a9b8b4d87703a5c4ade545124ec7220ec8b60 WatchSource:0}: Error finding container c5f55f1b75e03bf6d6c7799a817a9b8b4d87703a5c4ade545124ec7220ec8b60: Status 404 returned error can't find the container with id c5f55f1b75e03bf6d6c7799a817a9b8b4d87703a5c4ade545124ec7220ec8b60
Apr 20 20:18:44.704643 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.704615 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc\" (UID: \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc"
Apr 20 20:18:44.704804 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.704673 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc\" (UID: \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc"
Apr 20 20:18:44.704874 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.704797 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzk6k\" (UniqueName: \"kubernetes.io/projected/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-kube-api-access-mzk6k\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc\" (UID: \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc"
Apr 20 20:18:44.805814 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.805780 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc\" (UID: \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc"
Apr 20 20:18:44.805929 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.805888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzk6k\" (UniqueName: \"kubernetes.io/projected/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-kube-api-access-mzk6k\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc\" (UID: \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc"
Apr 20 20:18:44.805970 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.805949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc\" (UID: \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc"
Apr 20 20:18:44.806155 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.806138 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc\" (UID: \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc"
Apr 20 20:18:44.806269 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.806249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc\" (UID: \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc"
Apr 20 20:18:44.813644 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.813623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzk6k\" (UniqueName: \"kubernetes.io/projected/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-kube-api-access-mzk6k\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc\" (UID: \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc"
Apr 20 20:18:44.891428 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:44.891406 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc"
Apr 20 20:18:45.012294 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.012274 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc"]
Apr 20 20:18:45.117320 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:18:45.117237 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod442dbaa6_f6d0_4f7a_8b7f_76034bdc8733.slice/crio-3c36fb12d60b7bb849c992cf1591ece2d6e9bb69166722790ab9b2167a336477 WatchSource:0}: Error finding container 3c36fb12d60b7bb849c992cf1591ece2d6e9bb69166722790ab9b2167a336477: Status 404 returned error can't find the container with id 3c36fb12d60b7bb849c992cf1591ece2d6e9bb69166722790ab9b2167a336477
Apr 20 20:18:45.178703 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.178677 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl"]
Apr 20 20:18:45.181832 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.181814 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl"
Apr 20 20:18:45.190015 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.189992 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl"]
Apr 20 20:18:45.309674 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.309649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d36d5a16-a21c-4243-b5bd-c92a268d4972-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl\" (UID: \"d36d5a16-a21c-4243-b5bd-c92a268d4972\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl"
Apr 20 20:18:45.309821 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.309698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d36d5a16-a21c-4243-b5bd-c92a268d4972-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl\" (UID: \"d36d5a16-a21c-4243-b5bd-c92a268d4972\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl"
Apr 20 20:18:45.309821 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.309732 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47hvl\" (UniqueName: \"kubernetes.io/projected/d36d5a16-a21c-4243-b5bd-c92a268d4972-kube-api-access-47hvl\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl\" (UID: \"d36d5a16-a21c-4243-b5bd-c92a268d4972\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl"
Apr 20 20:18:45.351785 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.351725 2576 generic.go:358] "Generic (PLEG): container finished" podID="442dbaa6-f6d0-4f7a-8b7f-76034bdc8733" containerID="68fd91177eac170f5df44537efb5e19eb9f5f181e244f4c026e9e9f37c4f29ff" exitCode=0
Apr 20 20:18:45.351896 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.351836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc" event={"ID":"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733","Type":"ContainerDied","Data":"68fd91177eac170f5df44537efb5e19eb9f5f181e244f4c026e9e9f37c4f29ff"}
Apr 20 20:18:45.351896 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.351858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc" event={"ID":"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733","Type":"ContainerStarted","Data":"3c36fb12d60b7bb849c992cf1591ece2d6e9bb69166722790ab9b2167a336477"}
Apr 20 20:18:45.353383 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.353362 2576 generic.go:358] "Generic (PLEG): container finished" podID="b663e802-1dfb-42e4-85a6-7ccdd15d1ede" containerID="273b8b013bcfee9edbca521ac92e53a1b88d8fada614f9a80f88d713e8ef78b5" exitCode=0
Apr 20 20:18:45.353474 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.353429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k" event={"ID":"b663e802-1dfb-42e4-85a6-7ccdd15d1ede","Type":"ContainerDied","Data":"273b8b013bcfee9edbca521ac92e53a1b88d8fada614f9a80f88d713e8ef78b5"}
Apr 20 20:18:45.353532 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.353481 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k" event={"ID":"b663e802-1dfb-42e4-85a6-7ccdd15d1ede","Type":"ContainerStarted","Data":"c5f55f1b75e03bf6d6c7799a817a9b8b4d87703a5c4ade545124ec7220ec8b60"}
Apr 20 20:18:45.355408 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.355366 2576 generic.go:358] "Generic (PLEG): container finished" podID="c92c179e-7812-4572-8b15-32cab54315cc" containerID="bf441d04c34a325fbe2f475c1d991157a6b2760bf31ccc49647df4a54fb9eb1b" exitCode=0
Apr 20 20:18:45.355489 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.355458 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f" event={"ID":"c92c179e-7812-4572-8b15-32cab54315cc","Type":"ContainerDied","Data":"bf441d04c34a325fbe2f475c1d991157a6b2760bf31ccc49647df4a54fb9eb1b"}
Apr 20 20:18:45.410982 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.410961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d36d5a16-a21c-4243-b5bd-c92a268d4972-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl\" (UID: \"d36d5a16-a21c-4243-b5bd-c92a268d4972\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl"
Apr 20 20:18:45.411094 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.411017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47hvl\" (UniqueName: \"kubernetes.io/projected/d36d5a16-a21c-4243-b5bd-c92a268d4972-kube-api-access-47hvl\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl\" (UID: \"d36d5a16-a21c-4243-b5bd-c92a268d4972\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl"
Apr 20 20:18:45.411164 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.411091 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d36d5a16-a21c-4243-b5bd-c92a268d4972-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl\" (UID: \"d36d5a16-a21c-4243-b5bd-c92a268d4972\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl"
Apr 20 20:18:45.411368 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.411344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d36d5a16-a21c-4243-b5bd-c92a268d4972-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl\" (UID: \"d36d5a16-a21c-4243-b5bd-c92a268d4972\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl"
Apr 20 20:18:45.411443 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.411379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d36d5a16-a21c-4243-b5bd-c92a268d4972-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl\" (UID: \"d36d5a16-a21c-4243-b5bd-c92a268d4972\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl"
Apr 20 20:18:45.418250 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.418226 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47hvl\" (UniqueName: \"kubernetes.io/projected/d36d5a16-a21c-4243-b5bd-c92a268d4972-kube-api-access-47hvl\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl\" (UID: \"d36d5a16-a21c-4243-b5bd-c92a268d4972\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl"
Apr 20 20:18:45.492680 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.492647 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl"
Apr 20 20:18:45.615059 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:45.615030 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl"]
Apr 20 20:18:45.617444 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:18:45.617414 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd36d5a16_a21c_4243_b5bd_c92a268d4972.slice/crio-ae7f52b8d3db90262bcf669f7c8ceec253916861c8e36593210be7ae4c2ffbbf WatchSource:0}: Error finding container ae7f52b8d3db90262bcf669f7c8ceec253916861c8e36593210be7ae4c2ffbbf: Status 404 returned error can't find the container with id ae7f52b8d3db90262bcf669f7c8ceec253916861c8e36593210be7ae4c2ffbbf
Apr 20 20:18:45.738719 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:18:45.738698 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd36d5a16_a21c_4243_b5bd_c92a268d4972.slice/crio-44680675604ed90658ff8893873ea3f38f1257ff03a7dd05263fba376d1680bf.scope\": RecentStats: unable to find data in memory cache]"
Apr 20 20:18:46.361299 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:46.361231 2576 generic.go:358] "Generic (PLEG): container finished" podID="442dbaa6-f6d0-4f7a-8b7f-76034bdc8733" containerID="2c7c0bc4b6228fdd1861d6131cae489973a4e98233144be8057e2a5e43f89999" exitCode=0
Apr 20 20:18:46.361385 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:46.361300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc" event={"ID":"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733","Type":"ContainerDied","Data":"2c7c0bc4b6228fdd1861d6131cae489973a4e98233144be8057e2a5e43f89999"}
Apr 20 20:18:46.362893
ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:46.362872 2576 generic.go:358] "Generic (PLEG): container finished" podID="b663e802-1dfb-42e4-85a6-7ccdd15d1ede" containerID="73c452bfffc93ae24b31af0e61f1c3eed781738f4fc50cba2550fa894cdbc318" exitCode=0 Apr 20 20:18:46.363000 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:46.362955 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k" event={"ID":"b663e802-1dfb-42e4-85a6-7ccdd15d1ede","Type":"ContainerDied","Data":"73c452bfffc93ae24b31af0e61f1c3eed781738f4fc50cba2550fa894cdbc318"} Apr 20 20:18:46.364331 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:46.364308 2576 generic.go:358] "Generic (PLEG): container finished" podID="d36d5a16-a21c-4243-b5bd-c92a268d4972" containerID="44680675604ed90658ff8893873ea3f38f1257ff03a7dd05263fba376d1680bf" exitCode=0 Apr 20 20:18:46.364412 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:46.364331 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl" event={"ID":"d36d5a16-a21c-4243-b5bd-c92a268d4972","Type":"ContainerDied","Data":"44680675604ed90658ff8893873ea3f38f1257ff03a7dd05263fba376d1680bf"} Apr 20 20:18:46.364412 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:46.364357 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl" event={"ID":"d36d5a16-a21c-4243-b5bd-c92a268d4972","Type":"ContainerStarted","Data":"ae7f52b8d3db90262bcf669f7c8ceec253916861c8e36593210be7ae4c2ffbbf"} Apr 20 20:18:46.366511 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:46.366491 2576 generic.go:358] "Generic (PLEG): container finished" podID="c92c179e-7812-4572-8b15-32cab54315cc" containerID="b689b107dbed8eb99264933949ce242e3a9115b814aac6c191dc2d4482bcff6b" exitCode=0 Apr 20 20:18:46.366616 ip-10-0-129-247 kubenswrapper[2576]: I0420 
20:18:46.366550 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f" event={"ID":"c92c179e-7812-4572-8b15-32cab54315cc","Type":"ContainerDied","Data":"b689b107dbed8eb99264933949ce242e3a9115b814aac6c191dc2d4482bcff6b"} Apr 20 20:18:47.371862 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.371787 2576 generic.go:358] "Generic (PLEG): container finished" podID="442dbaa6-f6d0-4f7a-8b7f-76034bdc8733" containerID="913e159ea5f5081f334a7859d4d7ff29d0a69a81ab6c6f161062d71008b8ab6c" exitCode=0 Apr 20 20:18:47.372190 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.371871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc" event={"ID":"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733","Type":"ContainerDied","Data":"913e159ea5f5081f334a7859d4d7ff29d0a69a81ab6c6f161062d71008b8ab6c"} Apr 20 20:18:47.373621 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.373602 2576 generic.go:358] "Generic (PLEG): container finished" podID="b663e802-1dfb-42e4-85a6-7ccdd15d1ede" containerID="6add2d7bf4daec4bdc7f6d74ae9b225366d98c6abf1cfcb2734641d6673b03b8" exitCode=0 Apr 20 20:18:47.373717 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.373668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k" event={"ID":"b663e802-1dfb-42e4-85a6-7ccdd15d1ede","Type":"ContainerDied","Data":"6add2d7bf4daec4bdc7f6d74ae9b225366d98c6abf1cfcb2734641d6673b03b8"} Apr 20 20:18:47.375056 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.375032 2576 generic.go:358] "Generic (PLEG): container finished" podID="d36d5a16-a21c-4243-b5bd-c92a268d4972" containerID="c6b3508eb40b2ba7b45228e08e65a30242b763d5311aa51b1ebbb5a628944d79" exitCode=0 Apr 20 20:18:47.375175 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.375155 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl" event={"ID":"d36d5a16-a21c-4243-b5bd-c92a268d4972","Type":"ContainerDied","Data":"c6b3508eb40b2ba7b45228e08e65a30242b763d5311aa51b1ebbb5a628944d79"} Apr 20 20:18:47.497514 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.497495 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f" Apr 20 20:18:47.630210 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.630147 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c92c179e-7812-4572-8b15-32cab54315cc-bundle\") pod \"c92c179e-7812-4572-8b15-32cab54315cc\" (UID: \"c92c179e-7812-4572-8b15-32cab54315cc\") " Apr 20 20:18:47.630210 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.630204 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c92c179e-7812-4572-8b15-32cab54315cc-util\") pod \"c92c179e-7812-4572-8b15-32cab54315cc\" (UID: \"c92c179e-7812-4572-8b15-32cab54315cc\") " Apr 20 20:18:47.630389 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.630248 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cvfg\" (UniqueName: \"kubernetes.io/projected/c92c179e-7812-4572-8b15-32cab54315cc-kube-api-access-2cvfg\") pod \"c92c179e-7812-4572-8b15-32cab54315cc\" (UID: \"c92c179e-7812-4572-8b15-32cab54315cc\") " Apr 20 20:18:47.630638 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.630614 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c92c179e-7812-4572-8b15-32cab54315cc-bundle" (OuterVolumeSpecName: "bundle") pod "c92c179e-7812-4572-8b15-32cab54315cc" (UID: "c92c179e-7812-4572-8b15-32cab54315cc"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:18:47.632480 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.632457 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c92c179e-7812-4572-8b15-32cab54315cc-kube-api-access-2cvfg" (OuterVolumeSpecName: "kube-api-access-2cvfg") pod "c92c179e-7812-4572-8b15-32cab54315cc" (UID: "c92c179e-7812-4572-8b15-32cab54315cc"). InnerVolumeSpecName "kube-api-access-2cvfg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:18:47.635143 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.635114 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c92c179e-7812-4572-8b15-32cab54315cc-util" (OuterVolumeSpecName: "util") pod "c92c179e-7812-4572-8b15-32cab54315cc" (UID: "c92c179e-7812-4572-8b15-32cab54315cc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:18:47.730863 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.730837 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c92c179e-7812-4572-8b15-32cab54315cc-util\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:18:47.730863 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.730857 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2cvfg\" (UniqueName: \"kubernetes.io/projected/c92c179e-7812-4572-8b15-32cab54315cc-kube-api-access-2cvfg\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:18:47.730863 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:47.730866 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c92c179e-7812-4572-8b15-32cab54315cc-bundle\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:18:48.380408 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.380379 2576 generic.go:358] "Generic 
(PLEG): container finished" podID="d36d5a16-a21c-4243-b5bd-c92a268d4972" containerID="0b471b2b49192f84bfe708bc6bd1b55af5b04e63fff752e3bd583af64124a55c" exitCode=0 Apr 20 20:18:48.380787 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.380464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl" event={"ID":"d36d5a16-a21c-4243-b5bd-c92a268d4972","Type":"ContainerDied","Data":"0b471b2b49192f84bfe708bc6bd1b55af5b04e63fff752e3bd583af64124a55c"} Apr 20 20:18:48.382077 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.382059 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f" Apr 20 20:18:48.382168 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.382071 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f" event={"ID":"c92c179e-7812-4572-8b15-32cab54315cc","Type":"ContainerDied","Data":"f492586a7a4e54de3700e96bf06aeee70fe1418f80af2f21fb2ee0549d13b3cb"} Apr 20 20:18:48.382168 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.382098 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f492586a7a4e54de3700e96bf06aeee70fe1418f80af2f21fb2ee0549d13b3cb" Apr 20 20:18:48.525660 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.525641 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k" Apr 20 20:18:48.551295 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.551269 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc" Apr 20 20:18:48.638274 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.638208 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-util\") pod \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\" (UID: \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\") " Apr 20 20:18:48.638274 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.638235 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-util\") pod \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\" (UID: \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\") " Apr 20 20:18:48.638274 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.638256 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwlzd\" (UniqueName: \"kubernetes.io/projected/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-kube-api-access-lwlzd\") pod \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\" (UID: \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\") " Apr 20 20:18:48.638497 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.638275 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-bundle\") pod \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\" (UID: \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\") " Apr 20 20:18:48.638497 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.638301 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzk6k\" (UniqueName: \"kubernetes.io/projected/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-kube-api-access-mzk6k\") pod \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\" (UID: \"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733\") " Apr 20 20:18:48.638497 ip-10-0-129-247 
kubenswrapper[2576]: I0420 20:18:48.638351 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-bundle\") pod \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\" (UID: \"b663e802-1dfb-42e4-85a6-7ccdd15d1ede\") " Apr 20 20:18:48.639007 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.638976 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-bundle" (OuterVolumeSpecName: "bundle") pod "442dbaa6-f6d0-4f7a-8b7f-76034bdc8733" (UID: "442dbaa6-f6d0-4f7a-8b7f-76034bdc8733"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:18:48.639258 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.639226 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-bundle" (OuterVolumeSpecName: "bundle") pod "b663e802-1dfb-42e4-85a6-7ccdd15d1ede" (UID: "b663e802-1dfb-42e4-85a6-7ccdd15d1ede"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:18:48.640653 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.640592 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-kube-api-access-lwlzd" (OuterVolumeSpecName: "kube-api-access-lwlzd") pod "b663e802-1dfb-42e4-85a6-7ccdd15d1ede" (UID: "b663e802-1dfb-42e4-85a6-7ccdd15d1ede"). InnerVolumeSpecName "kube-api-access-lwlzd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:18:48.640724 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.640619 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-kube-api-access-mzk6k" (OuterVolumeSpecName: "kube-api-access-mzk6k") pod "442dbaa6-f6d0-4f7a-8b7f-76034bdc8733" (UID: "442dbaa6-f6d0-4f7a-8b7f-76034bdc8733"). InnerVolumeSpecName "kube-api-access-mzk6k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:18:48.646560 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.646535 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-util" (OuterVolumeSpecName: "util") pod "b663e802-1dfb-42e4-85a6-7ccdd15d1ede" (UID: "b663e802-1dfb-42e4-85a6-7ccdd15d1ede"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:18:48.647138 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.647121 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-util" (OuterVolumeSpecName: "util") pod "442dbaa6-f6d0-4f7a-8b7f-76034bdc8733" (UID: "442dbaa6-f6d0-4f7a-8b7f-76034bdc8733"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:18:48.739619 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.739595 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-bundle\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:18:48.739619 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.739616 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-util\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:18:48.739709 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.739625 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-util\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:18:48.739709 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.739635 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lwlzd\" (UniqueName: \"kubernetes.io/projected/b663e802-1dfb-42e4-85a6-7ccdd15d1ede-kube-api-access-lwlzd\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:18:48.739709 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.739644 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-bundle\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:18:48.739709 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:48.739652 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mzk6k\" (UniqueName: \"kubernetes.io/projected/442dbaa6-f6d0-4f7a-8b7f-76034bdc8733-kube-api-access-mzk6k\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:18:49.387977 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.387940 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc" event={"ID":"442dbaa6-f6d0-4f7a-8b7f-76034bdc8733","Type":"ContainerDied","Data":"3c36fb12d60b7bb849c992cf1591ece2d6e9bb69166722790ab9b2167a336477"} Apr 20 20:18:49.387977 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.387982 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c36fb12d60b7bb849c992cf1591ece2d6e9bb69166722790ab9b2167a336477" Apr 20 20:18:49.388438 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.387964 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc" Apr 20 20:18:49.389874 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.389853 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k" Apr 20 20:18:49.389874 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.389859 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k" event={"ID":"b663e802-1dfb-42e4-85a6-7ccdd15d1ede","Type":"ContainerDied","Data":"c5f55f1b75e03bf6d6c7799a817a9b8b4d87703a5c4ade545124ec7220ec8b60"} Apr 20 20:18:49.390068 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.389888 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5f55f1b75e03bf6d6c7799a817a9b8b4d87703a5c4ade545124ec7220ec8b60" Apr 20 20:18:49.517812 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.517791 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl" Apr 20 20:18:49.646093 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.646009 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d36d5a16-a21c-4243-b5bd-c92a268d4972-bundle\") pod \"d36d5a16-a21c-4243-b5bd-c92a268d4972\" (UID: \"d36d5a16-a21c-4243-b5bd-c92a268d4972\") " Apr 20 20:18:49.646093 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.646086 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47hvl\" (UniqueName: \"kubernetes.io/projected/d36d5a16-a21c-4243-b5bd-c92a268d4972-kube-api-access-47hvl\") pod \"d36d5a16-a21c-4243-b5bd-c92a268d4972\" (UID: \"d36d5a16-a21c-4243-b5bd-c92a268d4972\") " Apr 20 20:18:49.646307 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.646113 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d36d5a16-a21c-4243-b5bd-c92a268d4972-util\") pod \"d36d5a16-a21c-4243-b5bd-c92a268d4972\" (UID: \"d36d5a16-a21c-4243-b5bd-c92a268d4972\") " Apr 20 20:18:49.646519 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.646495 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36d5a16-a21c-4243-b5bd-c92a268d4972-bundle" (OuterVolumeSpecName: "bundle") pod "d36d5a16-a21c-4243-b5bd-c92a268d4972" (UID: "d36d5a16-a21c-4243-b5bd-c92a268d4972"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:18:49.648373 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.648347 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36d5a16-a21c-4243-b5bd-c92a268d4972-kube-api-access-47hvl" (OuterVolumeSpecName: "kube-api-access-47hvl") pod "d36d5a16-a21c-4243-b5bd-c92a268d4972" (UID: "d36d5a16-a21c-4243-b5bd-c92a268d4972"). InnerVolumeSpecName "kube-api-access-47hvl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:18:49.651241 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.651222 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36d5a16-a21c-4243-b5bd-c92a268d4972-util" (OuterVolumeSpecName: "util") pod "d36d5a16-a21c-4243-b5bd-c92a268d4972" (UID: "d36d5a16-a21c-4243-b5bd-c92a268d4972"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:18:49.747317 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.747293 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d36d5a16-a21c-4243-b5bd-c92a268d4972-bundle\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:18:49.747317 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.747317 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-47hvl\" (UniqueName: \"kubernetes.io/projected/d36d5a16-a21c-4243-b5bd-c92a268d4972-kube-api-access-47hvl\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:18:49.747449 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:49.747326 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d36d5a16-a21c-4243-b5bd-c92a268d4972-util\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:18:50.395153 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:50.395122 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl" event={"ID":"d36d5a16-a21c-4243-b5bd-c92a268d4972","Type":"ContainerDied","Data":"ae7f52b8d3db90262bcf669f7c8ceec253916861c8e36593210be7ae4c2ffbbf"} Apr 20 20:18:50.395153 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:50.395151 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae7f52b8d3db90262bcf669f7c8ceec253916861c8e36593210be7ae4c2ffbbf" Apr 20 20:18:50.395704 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:18:50.395180 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl" Apr 20 20:19:01.690766 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.690668 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76d5b6554f-6cnkf"] Apr 20 20:19:01.691196 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691173 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c92c179e-7812-4572-8b15-32cab54315cc" containerName="extract" Apr 20 20:19:01.691237 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691203 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92c179e-7812-4572-8b15-32cab54315cc" containerName="extract" Apr 20 20:19:01.691237 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691219 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="442dbaa6-f6d0-4f7a-8b7f-76034bdc8733" containerName="util" Apr 20 20:19:01.691237 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691227 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="442dbaa6-f6d0-4f7a-8b7f-76034bdc8733" containerName="util" Apr 20 20:19:01.691331 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691238 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c92c179e-7812-4572-8b15-32cab54315cc" 
containerName="pull" Apr 20 20:19:01.691331 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691247 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92c179e-7812-4572-8b15-32cab54315cc" containerName="pull" Apr 20 20:19:01.691331 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691262 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b663e802-1dfb-42e4-85a6-7ccdd15d1ede" containerName="pull" Apr 20 20:19:01.691331 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691270 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b663e802-1dfb-42e4-85a6-7ccdd15d1ede" containerName="pull" Apr 20 20:19:01.691331 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691282 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b663e802-1dfb-42e4-85a6-7ccdd15d1ede" containerName="extract" Apr 20 20:19:01.691331 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691290 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b663e802-1dfb-42e4-85a6-7ccdd15d1ede" containerName="extract" Apr 20 20:19:01.691331 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691306 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b663e802-1dfb-42e4-85a6-7ccdd15d1ede" containerName="util" Apr 20 20:19:01.691331 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691316 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b663e802-1dfb-42e4-85a6-7ccdd15d1ede" containerName="util" Apr 20 20:19:01.691331 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691328 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="442dbaa6-f6d0-4f7a-8b7f-76034bdc8733" containerName="extract" Apr 20 20:19:01.691571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691336 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="442dbaa6-f6d0-4f7a-8b7f-76034bdc8733" containerName="extract" Apr 20 20:19:01.691571 ip-10-0-129-247 
kubenswrapper[2576]: I0420 20:19:01.691342 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d36d5a16-a21c-4243-b5bd-c92a268d4972" containerName="extract"
Apr 20 20:19:01.691571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691347 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36d5a16-a21c-4243-b5bd-c92a268d4972" containerName="extract"
Apr 20 20:19:01.691571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691356 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c92c179e-7812-4572-8b15-32cab54315cc" containerName="util"
Apr 20 20:19:01.691571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691363 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92c179e-7812-4572-8b15-32cab54315cc" containerName="util"
Apr 20 20:19:01.691571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691371 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="442dbaa6-f6d0-4f7a-8b7f-76034bdc8733" containerName="pull"
Apr 20 20:19:01.691571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691376 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="442dbaa6-f6d0-4f7a-8b7f-76034bdc8733" containerName="pull"
Apr 20 20:19:01.691571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691383 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d36d5a16-a21c-4243-b5bd-c92a268d4972" containerName="pull"
Apr 20 20:19:01.691571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691387 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36d5a16-a21c-4243-b5bd-c92a268d4972" containerName="pull"
Apr 20 20:19:01.691571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691396 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d36d5a16-a21c-4243-b5bd-c92a268d4972" containerName="util"
Apr 20 20:19:01.691571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691400 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36d5a16-a21c-4243-b5bd-c92a268d4972" containerName="util"
Apr 20 20:19:01.691571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691464 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b663e802-1dfb-42e4-85a6-7ccdd15d1ede" containerName="extract"
Apr 20 20:19:01.691571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691475 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d36d5a16-a21c-4243-b5bd-c92a268d4972" containerName="extract"
Apr 20 20:19:01.691571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691485 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="442dbaa6-f6d0-4f7a-8b7f-76034bdc8733" containerName="extract"
Apr 20 20:19:01.691571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.691495 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c92c179e-7812-4572-8b15-32cab54315cc" containerName="extract"
Apr 20 20:19:01.694840 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.694818 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.699856 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.699288 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 20:19:01.699856 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.699505 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 20:19:01.699856 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.699757 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 20:19:01.700130 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.699975 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-m57vs\""
Apr 20 20:19:01.700295 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.700268 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 20:19:01.700503 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.700487 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 20:19:01.704757 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.704530 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 20:19:01.705657 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.705632 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76d5b6554f-6cnkf"]
Apr 20 20:19:01.832985 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.832956 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53b7bb3c-379a-458b-bbec-0c2d75a804bd-console-serving-cert\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.833146 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.833000 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53b7bb3c-379a-458b-bbec-0c2d75a804bd-console-config\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.833146 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.833028 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53b7bb3c-379a-458b-bbec-0c2d75a804bd-service-ca\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.833146 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.833073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53b7bb3c-379a-458b-bbec-0c2d75a804bd-console-oauth-config\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.833146 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.833128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53b7bb3c-379a-458b-bbec-0c2d75a804bd-trusted-ca-bundle\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.833146 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.833147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kww65\" (UniqueName: \"kubernetes.io/projected/53b7bb3c-379a-458b-bbec-0c2d75a804bd-kube-api-access-kww65\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.833323 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.833163 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53b7bb3c-379a-458b-bbec-0c2d75a804bd-oauth-serving-cert\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.934217 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.934189 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53b7bb3c-379a-458b-bbec-0c2d75a804bd-console-oauth-config\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.934346 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.934242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53b7bb3c-379a-458b-bbec-0c2d75a804bd-trusted-ca-bundle\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.934403 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.934347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kww65\" (UniqueName: \"kubernetes.io/projected/53b7bb3c-379a-458b-bbec-0c2d75a804bd-kube-api-access-kww65\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.934403 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.934377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53b7bb3c-379a-458b-bbec-0c2d75a804bd-oauth-serving-cert\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.934501 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.934411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53b7bb3c-379a-458b-bbec-0c2d75a804bd-console-serving-cert\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.934501 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.934441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53b7bb3c-379a-458b-bbec-0c2d75a804bd-console-config\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.934597 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.934555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53b7bb3c-379a-458b-bbec-0c2d75a804bd-service-ca\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.935133 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.935110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53b7bb3c-379a-458b-bbec-0c2d75a804bd-console-config\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.935232 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.935175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53b7bb3c-379a-458b-bbec-0c2d75a804bd-trusted-ca-bundle\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.935232 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.935186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53b7bb3c-379a-458b-bbec-0c2d75a804bd-service-ca\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.935232 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.935175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53b7bb3c-379a-458b-bbec-0c2d75a804bd-oauth-serving-cert\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.937018 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.936995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53b7bb3c-379a-458b-bbec-0c2d75a804bd-console-serving-cert\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.937098 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.936995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53b7bb3c-379a-458b-bbec-0c2d75a804bd-console-oauth-config\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:01.943663 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:01.943611 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kww65\" (UniqueName: \"kubernetes.io/projected/53b7bb3c-379a-458b-bbec-0c2d75a804bd-kube-api-access-kww65\") pod \"console-76d5b6554f-6cnkf\" (UID: \"53b7bb3c-379a-458b-bbec-0c2d75a804bd\") " pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:02.008545 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:02.008522 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:02.138345 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:02.138323 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76d5b6554f-6cnkf"]
Apr 20 20:19:02.140199 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:19:02.140171 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53b7bb3c_379a_458b_bbec_0c2d75a804bd.slice/crio-74c49e14caa77923ba2307407f6687ea6d8a67b4f83e52b566c8d67521e491ef WatchSource:0}: Error finding container 74c49e14caa77923ba2307407f6687ea6d8a67b4f83e52b566c8d67521e491ef: Status 404 returned error can't find the container with id 74c49e14caa77923ba2307407f6687ea6d8a67b4f83e52b566c8d67521e491ef
Apr 20 20:19:02.441278 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:02.441235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76d5b6554f-6cnkf" event={"ID":"53b7bb3c-379a-458b-bbec-0c2d75a804bd","Type":"ContainerStarted","Data":"e6cb03a220f5b41bd602aa36c0662232cea965139aef5a7d0954d8aa5c21d39d"}
Apr 20 20:19:02.441485 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:02.441468 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76d5b6554f-6cnkf" event={"ID":"53b7bb3c-379a-458b-bbec-0c2d75a804bd","Type":"ContainerStarted","Data":"74c49e14caa77923ba2307407f6687ea6d8a67b4f83e52b566c8d67521e491ef"}
Apr 20 20:19:02.468308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:02.468217 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76d5b6554f-6cnkf" podStartSLOduration=1.46820347 podStartE2EDuration="1.46820347s" podCreationTimestamp="2026-04-20 20:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:19:02.465413252 +0000 UTC m=+458.335191881" watchObservedRunningTime="2026-04-20 20:19:02.46820347 +0000 UTC m=+458.337982101"
Apr 20 20:19:04.893186 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:04.893150 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"]
Apr 20 20:19:04.895942 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:04.895918 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"
Apr 20 20:19:04.900985 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:04.900964 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-552kj\""
Apr 20 20:19:04.901178 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:04.901124 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 20:19:04.901677 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:04.901659 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 20:19:04.914574 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:04.914549 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"]
Apr 20 20:19:04.959728 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:04.959700 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/01fd4ff6-b611-4c7d-9672-9e1f58d50432-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" (UID: \"01fd4ff6-b611-4c7d-9672-9e1f58d50432\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"
Apr 20 20:19:04.959863 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:04.959768 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2f5l\" (UniqueName: \"kubernetes.io/projected/01fd4ff6-b611-4c7d-9672-9e1f58d50432-kube-api-access-z2f5l\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" (UID: \"01fd4ff6-b611-4c7d-9672-9e1f58d50432\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"
Apr 20 20:19:05.060940 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:05.060915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/01fd4ff6-b611-4c7d-9672-9e1f58d50432-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" (UID: \"01fd4ff6-b611-4c7d-9672-9e1f58d50432\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"
Apr 20 20:19:05.061037 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:05.060951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2f5l\" (UniqueName: \"kubernetes.io/projected/01fd4ff6-b611-4c7d-9672-9e1f58d50432-kube-api-access-z2f5l\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" (UID: \"01fd4ff6-b611-4c7d-9672-9e1f58d50432\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"
Apr 20 20:19:05.061322 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:05.061300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/01fd4ff6-b611-4c7d-9672-9e1f58d50432-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" (UID: \"01fd4ff6-b611-4c7d-9672-9e1f58d50432\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"
Apr 20 20:19:05.081046 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:05.081024 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2f5l\" (UniqueName: \"kubernetes.io/projected/01fd4ff6-b611-4c7d-9672-9e1f58d50432-kube-api-access-z2f5l\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" (UID: \"01fd4ff6-b611-4c7d-9672-9e1f58d50432\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"
Apr 20 20:19:05.206856 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:05.206804 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"
Apr 20 20:19:05.341462 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:05.341438 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"]
Apr 20 20:19:05.343796 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:19:05.343766 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01fd4ff6_b611_4c7d_9672_9e1f58d50432.slice/crio-e0763a27732bec6266f1c5f103e5fc0cf842560c333e0bd0fb826e54e0b9b270 WatchSource:0}: Error finding container e0763a27732bec6266f1c5f103e5fc0cf842560c333e0bd0fb826e54e0b9b270: Status 404 returned error can't find the container with id e0763a27732bec6266f1c5f103e5fc0cf842560c333e0bd0fb826e54e0b9b270
Apr 20 20:19:05.453760 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:05.453721 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" event={"ID":"01fd4ff6-b611-4c7d-9672-9e1f58d50432","Type":"ContainerStarted","Data":"e0763a27732bec6266f1c5f103e5fc0cf842560c333e0bd0fb826e54e0b9b270"}
Apr 20 20:19:10.985137 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:10.985104 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-2fn97"]
Apr 20 20:19:10.988690 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:10.988667 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2fn97"
Apr 20 20:19:10.991266 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:10.991219 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 20 20:19:10.991390 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:10.991291 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-ds6ff\""
Apr 20 20:19:10.997174 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:10.997058 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-2fn97"]
Apr 20 20:19:11.112104 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:11.112069 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgdgn\" (UniqueName: \"kubernetes.io/projected/9f9c407d-4549-4dca-8ef1-7fabaf3b452b-kube-api-access-rgdgn\") pod \"dns-operator-controller-manager-648d5c98bc-2fn97\" (UID: \"9f9c407d-4549-4dca-8ef1-7fabaf3b452b\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2fn97"
Apr 20 20:19:11.212685 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:11.212641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgdgn\" (UniqueName: \"kubernetes.io/projected/9f9c407d-4549-4dca-8ef1-7fabaf3b452b-kube-api-access-rgdgn\") pod \"dns-operator-controller-manager-648d5c98bc-2fn97\" (UID: \"9f9c407d-4549-4dca-8ef1-7fabaf3b452b\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2fn97"
Apr 20 20:19:11.223874 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:11.223843 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgdgn\" (UniqueName: \"kubernetes.io/projected/9f9c407d-4549-4dca-8ef1-7fabaf3b452b-kube-api-access-rgdgn\") pod \"dns-operator-controller-manager-648d5c98bc-2fn97\" (UID: \"9f9c407d-4549-4dca-8ef1-7fabaf3b452b\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2fn97"
Apr 20 20:19:11.302663 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:11.302628 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2fn97"
Apr 20 20:19:11.900658 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:11.900637 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-2fn97"]
Apr 20 20:19:11.902727 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:19:11.902686 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f9c407d_4549_4dca_8ef1_7fabaf3b452b.slice/crio-e2ab755721babdd92f6d20f5479c04fab4da40f5a95f88ef485e60bbea6f264b WatchSource:0}: Error finding container e2ab755721babdd92f6d20f5479c04fab4da40f5a95f88ef485e60bbea6f264b: Status 404 returned error can't find the container with id e2ab755721babdd92f6d20f5479c04fab4da40f5a95f88ef485e60bbea6f264b
Apr 20 20:19:12.009444 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:12.009402 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:12.009875 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:12.009464 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:12.014549 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:12.014528 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:12.484193 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:12.484150 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" event={"ID":"01fd4ff6-b611-4c7d-9672-9e1f58d50432","Type":"ContainerStarted","Data":"542e05063b17bd5aa50bba8353fb2e526337021088c40ef37f53589354454398"}
Apr 20 20:19:12.484376 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:12.484287 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"
Apr 20 20:19:12.485353 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:12.485324 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2fn97" event={"ID":"9f9c407d-4549-4dca-8ef1-7fabaf3b452b","Type":"ContainerStarted","Data":"e2ab755721babdd92f6d20f5479c04fab4da40f5a95f88ef485e60bbea6f264b"}
Apr 20 20:19:12.489228 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:12.489210 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76d5b6554f-6cnkf"
Apr 20 20:19:12.505913 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:12.505866 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" podStartSLOduration=2.013911479 podStartE2EDuration="8.505850906s" podCreationTimestamp="2026-04-20 20:19:04 +0000 UTC" firstStartedPulling="2026-04-20 20:19:05.346406297 +0000 UTC m=+461.216184913" lastFinishedPulling="2026-04-20 20:19:11.838345732 +0000 UTC m=+467.708124340" observedRunningTime="2026-04-20 20:19:12.503880155 +0000 UTC m=+468.373658793" watchObservedRunningTime="2026-04-20 20:19:12.505850906 +0000 UTC m=+468.375629537"
Apr 20 20:19:15.497932 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:15.497894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2fn97" event={"ID":"9f9c407d-4549-4dca-8ef1-7fabaf3b452b","Type":"ContainerStarted","Data":"0d644851a2e6844d325977543acfb1b43aa604797619c4871d38b899d8de1733"}
Apr 20 20:19:15.498312 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:15.497959 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2fn97"
Apr 20 20:19:15.517443 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:15.517397 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2fn97" podStartSLOduration=2.7457227829999997 podStartE2EDuration="5.51738315s" podCreationTimestamp="2026-04-20 20:19:10 +0000 UTC" firstStartedPulling="2026-04-20 20:19:11.904786049 +0000 UTC m=+467.774564658" lastFinishedPulling="2026-04-20 20:19:14.676446409 +0000 UTC m=+470.546225025" observedRunningTime="2026-04-20 20:19:15.514521974 +0000 UTC m=+471.384300605" watchObservedRunningTime="2026-04-20 20:19:15.51738315 +0000 UTC m=+471.387161780"
Apr 20 20:19:23.490612 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:23.490578 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"
Apr 20 20:19:24.576593 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:24.576562 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk"]
Apr 20 20:19:24.581868 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:24.581847 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk"
Apr 20 20:19:24.593050 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:24.593027 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk"]
Apr 20 20:19:24.715281 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:24.715255 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/061697ba-e757-4a93-8595-f749da70300d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-r5smk\" (UID: \"061697ba-e757-4a93-8595-f749da70300d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk"
Apr 20 20:19:24.715421 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:24.715293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clrbd\" (UniqueName: \"kubernetes.io/projected/061697ba-e757-4a93-8595-f749da70300d-kube-api-access-clrbd\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-r5smk\" (UID: \"061697ba-e757-4a93-8595-f749da70300d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk"
Apr 20 20:19:24.815775 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:24.815720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/061697ba-e757-4a93-8595-f749da70300d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-r5smk\" (UID: \"061697ba-e757-4a93-8595-f749da70300d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk"
Apr 20 20:19:24.815910 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:24.815799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clrbd\" (UniqueName: \"kubernetes.io/projected/061697ba-e757-4a93-8595-f749da70300d-kube-api-access-clrbd\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-r5smk\" (UID: \"061697ba-e757-4a93-8595-f749da70300d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk"
Apr 20 20:19:24.816143 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:24.816114 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/061697ba-e757-4a93-8595-f749da70300d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-r5smk\" (UID: \"061697ba-e757-4a93-8595-f749da70300d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk"
Apr 20 20:19:24.834667 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:24.834600 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clrbd\" (UniqueName: \"kubernetes.io/projected/061697ba-e757-4a93-8595-f749da70300d-kube-api-access-clrbd\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-r5smk\" (UID: \"061697ba-e757-4a93-8595-f749da70300d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk"
Apr 20 20:19:24.893352 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:24.893327 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk"
Apr 20 20:19:25.195060 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.194982 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"]
Apr 20 20:19:25.195247 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.195224 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" containerName="manager" containerID="cri-o://542e05063b17bd5aa50bba8353fb2e526337021088c40ef37f53589354454398" gracePeriod=2
Apr 20 20:19:25.208452 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.208423 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz"]
Apr 20 20:19:25.220984 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.220959 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk"]
Apr 20 20:19:25.224687 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.224665 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb"]
Apr 20 20:19:25.225214 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.225195 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" containerName="manager"
Apr 20 20:19:25.225290 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.225219 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" containerName="manager"
Apr 20 20:19:25.225346 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.225322 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" containerName="manager"
Apr 20 20:19:25.228852 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.228834 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk"]
Apr 20 20:19:25.228972 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.228934 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb"
Apr 20 20:19:25.231188 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.231160 2576 status_manager.go:895] "Failed to get status for pod" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object"
Apr 20 20:19:25.236195 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.236171 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb"]
Apr 20 20:19:25.248536 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.248517 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr"]
Apr 20 20:19:25.252392 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.252377 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr"
Apr 20 20:19:25.265910 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.265874 2576 status_manager.go:895] "Failed to get status for pod" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object"
Apr 20 20:19:25.270984 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.270958 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr"]
Apr 20 20:19:25.320436 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.320403 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/293c47d2-e1aa-4b2b-8e1a-a70399994912-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-hh8lb\" (UID: \"293c47d2-e1aa-4b2b-8e1a-a70399994912\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb"
Apr 20 20:19:25.320562 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.320505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prc8w\" (UniqueName: \"kubernetes.io/projected/293c47d2-e1aa-4b2b-8e1a-a70399994912-kube-api-access-prc8w\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-hh8lb\" (UID: \"293c47d2-e1aa-4b2b-8e1a-a70399994912\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb"
Apr 20 20:19:25.369452 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:19:25.369283 2576 log.go:32] "RunPodSandbox from runtime service failed" err=<
Apr 20 20:19:25.369452 ip-10-0-129-247 kubenswrapper[2576]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_kuadrant-operator-controller-manager-6bc9f4c76f-r5smk_kuadrant-system_061697ba-e757-4a93-8595-f749da70300d_0(7157885016a95e65d58e2777d0937973640dc3c30f98eb670da1b708fd35ba1e): error adding pod kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-r5smk to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7157885016a95e65d58e2777d0937973640dc3c30f98eb670da1b708fd35ba1e" Netns:"/var/run/netns/cf56f222-8f58-4b06-90f3-c99e15dcac3a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=kuadrant-system;K8S_POD_NAME=kuadrant-operator-controller-manager-6bc9f4c76f-r5smk;K8S_POD_INFRA_CONTAINER_ID=7157885016a95e65d58e2777d0937973640dc3c30f98eb670da1b708fd35ba1e;K8S_POD_UID=061697ba-e757-4a93-8595-f749da70300d" Path:"" ERRORED: error configuring pod [kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk] networking: Multus: [kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk/061697ba-e757-4a93-8595-f749da70300d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod kuadrant-operator-controller-manager-6bc9f4c76f-r5smk in out of cluster comm: SetNetworkStatus: failed to update the pod kuadrant-operator-controller-manager-6bc9f4c76f-r5smk in out of cluster comm: status update failed for pod /: pods "kuadrant-operator-controller-manager-6bc9f4c76f-r5smk" not found
Apr 20 20:19:25.369452 ip-10-0-129-247 kubenswrapper[2576]: ': StdinData: 
{"auxiliaryCNIChainName":"vendor-cni-chain","binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Apr 20 20:19:25.369452 ip-10-0-129-247 kubenswrapper[2576]: > Apr 20 20:19:25.369452 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:19:25.369358 2576 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err=< Apr 20 20:19:25.369452 ip-10-0-129-247 kubenswrapper[2576]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_kuadrant-operator-controller-manager-6bc9f4c76f-r5smk_kuadrant-system_061697ba-e757-4a93-8595-f749da70300d_0(7157885016a95e65d58e2777d0937973640dc3c30f98eb670da1b708fd35ba1e): error adding pod kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-r5smk to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7157885016a95e65d58e2777d0937973640dc3c30f98eb670da1b708fd35ba1e" Netns:"/var/run/netns/cf56f222-8f58-4b06-90f3-c99e15dcac3a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=kuadrant-system;K8S_POD_NAME=kuadrant-operator-controller-manager-6bc9f4c76f-r5smk;K8S_POD_INFRA_CONTAINER_ID=7157885016a95e65d58e2777d0937973640dc3c30f98eb670da1b708fd35ba1e;K8S_POD_UID=061697ba-e757-4a93-8595-f749da70300d" Path:"" ERRORED: error configuring pod [kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk] networking: Multus: [kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk/061697ba-e757-4a93-8595-f749da70300d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod kuadrant-operator-controller-manager-6bc9f4c76f-r5smk in out of 
cluster comm: SetNetworkStatus: failed to update the pod kuadrant-operator-controller-manager-6bc9f4c76f-r5smk in out of cluster comm: status update failed for pod /: pods "kuadrant-operator-controller-manager-6bc9f4c76f-r5smk" not found Apr 20 20:19:25.369452 ip-10-0-129-247 kubenswrapper[2576]: ': StdinData: {"auxiliaryCNIChainName":"vendor-cni-chain","binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Apr 20 20:19:25.369452 ip-10-0-129-247 kubenswrapper[2576]: > pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk" Apr 20 20:19:25.421016 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.420991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7a10219d-4115-4469-85f2-5c1d78c8749c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr\" (UID: \"7a10219d-4115-4469-85f2-5c1d78c8749c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr" Apr 20 20:19:25.421108 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.421032 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prc8w\" (UniqueName: \"kubernetes.io/projected/293c47d2-e1aa-4b2b-8e1a-a70399994912-kube-api-access-prc8w\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-hh8lb\" (UID: \"293c47d2-e1aa-4b2b-8e1a-a70399994912\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb" Apr 20 20:19:25.421178 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.421161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/293c47d2-e1aa-4b2b-8e1a-a70399994912-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-hh8lb\" (UID: \"293c47d2-e1aa-4b2b-8e1a-a70399994912\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb" Apr 20 20:19:25.421262 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.421222 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb2wt\" (UniqueName: \"kubernetes.io/projected/7a10219d-4115-4469-85f2-5c1d78c8749c-kube-api-access-vb2wt\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr\" (UID: \"7a10219d-4115-4469-85f2-5c1d78c8749c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr" Apr 20 20:19:25.421553 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.421533 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/293c47d2-e1aa-4b2b-8e1a-a70399994912-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-hh8lb\" (UID: \"293c47d2-e1aa-4b2b-8e1a-a70399994912\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb" Apr 20 20:19:25.425810 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.425729 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" Apr 20 20:19:25.428466 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.428442 2576 status_manager.go:895] "Failed to get status for pod" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object" Apr 20 20:19:25.435927 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.435899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prc8w\" (UniqueName: \"kubernetes.io/projected/293c47d2-e1aa-4b2b-8e1a-a70399994912-kube-api-access-prc8w\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-hh8lb\" (UID: \"293c47d2-e1aa-4b2b-8e1a-a70399994912\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb" Apr 20 20:19:25.522141 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.522065 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/01fd4ff6-b611-4c7d-9672-9e1f58d50432-extensions-socket-volume\") pod \"01fd4ff6-b611-4c7d-9672-9e1f58d50432\" (UID: \"01fd4ff6-b611-4c7d-9672-9e1f58d50432\") " Apr 20 20:19:25.522141 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.522123 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2f5l\" (UniqueName: \"kubernetes.io/projected/01fd4ff6-b611-4c7d-9672-9e1f58d50432-kube-api-access-z2f5l\") pod \"01fd4ff6-b611-4c7d-9672-9e1f58d50432\" (UID: \"01fd4ff6-b611-4c7d-9672-9e1f58d50432\") " Apr 20 20:19:25.522341 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.522236 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb2wt\" (UniqueName: \"kubernetes.io/projected/7a10219d-4115-4469-85f2-5c1d78c8749c-kube-api-access-vb2wt\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr\" (UID: \"7a10219d-4115-4469-85f2-5c1d78c8749c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr" Apr 20 20:19:25.522341 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.522259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7a10219d-4115-4469-85f2-5c1d78c8749c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr\" (UID: \"7a10219d-4115-4469-85f2-5c1d78c8749c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr" Apr 20 20:19:25.522456 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.522396 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01fd4ff6-b611-4c7d-9672-9e1f58d50432-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "01fd4ff6-b611-4c7d-9672-9e1f58d50432" (UID: "01fd4ff6-b611-4c7d-9672-9e1f58d50432"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:19:25.522602 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.522584 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7a10219d-4115-4469-85f2-5c1d78c8749c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr\" (UID: \"7a10219d-4115-4469-85f2-5c1d78c8749c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr" Apr 20 20:19:25.524429 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.524404 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01fd4ff6-b611-4c7d-9672-9e1f58d50432-kube-api-access-z2f5l" (OuterVolumeSpecName: "kube-api-access-z2f5l") pod "01fd4ff6-b611-4c7d-9672-9e1f58d50432" (UID: "01fd4ff6-b611-4c7d-9672-9e1f58d50432"). InnerVolumeSpecName "kube-api-access-z2f5l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:19:25.533607 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.533580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb2wt\" (UniqueName: \"kubernetes.io/projected/7a10219d-4115-4469-85f2-5c1d78c8749c-kube-api-access-vb2wt\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr\" (UID: \"7a10219d-4115-4469-85f2-5c1d78c8749c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr" Apr 20 20:19:25.539677 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.539654 2576 generic.go:358] "Generic (PLEG): container finished" podID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" containerID="542e05063b17bd5aa50bba8353fb2e526337021088c40ef37f53589354454398" exitCode=0 Apr 20 20:19:25.539797 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.539713 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" Apr 20 20:19:25.539797 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.539699 2576 scope.go:117] "RemoveContainer" containerID="542e05063b17bd5aa50bba8353fb2e526337021088c40ef37f53589354454398" Apr 20 20:19:25.539947 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.539920 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk" Apr 20 20:19:25.542127 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.542098 2576 status_manager.go:895] "Failed to get status for pod" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object" Apr 20 20:19:25.544017 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.543994 2576 status_manager.go:895] "Failed to get status for pod" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object" Apr 20 20:19:25.544730 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.544705 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk" Apr 20 20:19:25.546021 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.546000 2576 status_manager.go:895] "Failed to get status for pod" podUID="061697ba-e757-4a93-8595-f749da70300d" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-r5smk\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object" Apr 20 20:19:25.548084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.548063 2576 status_manager.go:895] "Failed to get status for pod" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object" Apr 20 20:19:25.548900 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.548878 2576 scope.go:117] "RemoveContainer" containerID="542e05063b17bd5aa50bba8353fb2e526337021088c40ef37f53589354454398" Apr 20 20:19:25.549136 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:19:25.549116 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"542e05063b17bd5aa50bba8353fb2e526337021088c40ef37f53589354454398\": container with ID starting with 542e05063b17bd5aa50bba8353fb2e526337021088c40ef37f53589354454398 not found: ID does not exist" containerID="542e05063b17bd5aa50bba8353fb2e526337021088c40ef37f53589354454398" Apr 20 20:19:25.549194 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.549143 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"542e05063b17bd5aa50bba8353fb2e526337021088c40ef37f53589354454398"} err="failed to get container status \"542e05063b17bd5aa50bba8353fb2e526337021088c40ef37f53589354454398\": rpc error: code = NotFound desc = could not find container \"542e05063b17bd5aa50bba8353fb2e526337021088c40ef37f53589354454398\": container with ID starting with 542e05063b17bd5aa50bba8353fb2e526337021088c40ef37f53589354454398 not found: ID does not exist" Apr 20 20:19:25.550029 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.550009 2576 status_manager.go:895] "Failed to get status for pod" podUID="061697ba-e757-4a93-8595-f749da70300d" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-r5smk\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object" Apr 20 20:19:25.551879 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.551859 2576 status_manager.go:895] "Failed to get status for pod" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object" Apr 20 20:19:25.553872 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.553853 2576 status_manager.go:895] "Failed to get status for pod" podUID="061697ba-e757-4a93-8595-f749da70300d" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-r5smk\" is forbidden: 
User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object" Apr 20 20:19:25.581453 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.581434 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb" Apr 20 20:19:25.590858 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.590837 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr" Apr 20 20:19:25.623795 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.623712 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z2f5l\" (UniqueName: \"kubernetes.io/projected/01fd4ff6-b611-4c7d-9672-9e1f58d50432-kube-api-access-z2f5l\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:19:25.623931 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.623804 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/01fd4ff6-b611-4c7d-9672-9e1f58d50432-extensions-socket-volume\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:19:25.724113 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.724086 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/061697ba-e757-4a93-8595-f749da70300d-extensions-socket-volume\") pod \"061697ba-e757-4a93-8595-f749da70300d\" (UID: \"061697ba-e757-4a93-8595-f749da70300d\") " Apr 20 20:19:25.724252 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.724228 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clrbd\" (UniqueName: 
\"kubernetes.io/projected/061697ba-e757-4a93-8595-f749da70300d-kube-api-access-clrbd\") pod \"061697ba-e757-4a93-8595-f749da70300d\" (UID: \"061697ba-e757-4a93-8595-f749da70300d\") " Apr 20 20:19:25.724387 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.724362 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/061697ba-e757-4a93-8595-f749da70300d-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "061697ba-e757-4a93-8595-f749da70300d" (UID: "061697ba-e757-4a93-8595-f749da70300d"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:19:25.724467 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.724452 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/061697ba-e757-4a93-8595-f749da70300d-extensions-socket-volume\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:19:25.726311 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.726279 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/061697ba-e757-4a93-8595-f749da70300d-kube-api-access-clrbd" (OuterVolumeSpecName: "kube-api-access-clrbd") pod "061697ba-e757-4a93-8595-f749da70300d" (UID: "061697ba-e757-4a93-8595-f749da70300d"). InnerVolumeSpecName "kube-api-access-clrbd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:19:25.743255 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.743232 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr"] Apr 20 20:19:25.744482 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:19:25.744458 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a10219d_4115_4469_85f2_5c1d78c8749c.slice/crio-a779b551433d69bdcf5ac05252a553c0ffe3d122c0eead3398fce00f224ef5dd WatchSource:0}: Error finding container a779b551433d69bdcf5ac05252a553c0ffe3d122c0eead3398fce00f224ef5dd: Status 404 returned error can't find the container with id a779b551433d69bdcf5ac05252a553c0ffe3d122c0eead3398fce00f224ef5dd Apr 20 20:19:25.824979 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.824956 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-clrbd\" (UniqueName: \"kubernetes.io/projected/061697ba-e757-4a93-8595-f749da70300d-kube-api-access-clrbd\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:19:25.920246 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:25.920215 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb"] Apr 20 20:19:25.922052 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:19:25.922027 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod293c47d2_e1aa_4b2b_8e1a_a70399994912.slice/crio-848c3c4a3c6ce3be98dd40acc86b435785ed6673563a3316bb9ea3e4bbb3e389 WatchSource:0}: Error finding container 848c3c4a3c6ce3be98dd40acc86b435785ed6673563a3316bb9ea3e4bbb3e389: Status 404 returned error can't find the container with id 848c3c4a3c6ce3be98dd40acc86b435785ed6673563a3316bb9ea3e4bbb3e389 Apr 20 20:19:26.505372 ip-10-0-129-247 kubenswrapper[2576]: I0420 
20:19:26.505336 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2fn97" Apr 20 20:19:26.507783 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.507715 2576 status_manager.go:895] "Failed to get status for pod" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object" Apr 20 20:19:26.532959 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.532922 2576 status_manager.go:895] "Failed to get status for pod" podUID="061697ba-e757-4a93-8595-f749da70300d" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-r5smk\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object" Apr 20 20:19:26.547657 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.547626 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr" event={"ID":"7a10219d-4115-4469-85f2-5c1d78c8749c","Type":"ContainerStarted","Data":"7348d74fe9528bbe6c0576a01d114ebf1fb0ce7f3d89d4990247350d0f9ae988"} Apr 20 20:19:26.547828 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.547665 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr" 
event={"ID":"7a10219d-4115-4469-85f2-5c1d78c8749c","Type":"ContainerStarted","Data":"a779b551433d69bdcf5ac05252a553c0ffe3d122c0eead3398fce00f224ef5dd"} Apr 20 20:19:26.547828 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.547783 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr" Apr 20 20:19:26.549295 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.549273 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk" Apr 20 20:19:26.549295 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.549285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb" event={"ID":"293c47d2-e1aa-4b2b-8e1a-a70399994912","Type":"ContainerStarted","Data":"504c384119f8c81a7ae0f0d750db36b865185581d3b128a1ba7c99f7dd6686e1"} Apr 20 20:19:26.549483 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.549315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb" event={"ID":"293c47d2-e1aa-4b2b-8e1a-a70399994912","Type":"ContainerStarted","Data":"848c3c4a3c6ce3be98dd40acc86b435785ed6673563a3316bb9ea3e4bbb3e389"} Apr 20 20:19:26.549483 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.549392 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb" Apr 20 20:19:26.549980 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.549950 2576 status_manager.go:895] "Failed to get status for pod" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get 
resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object"
Apr 20 20:19:26.551938 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.551911 2576 status_manager.go:895] "Failed to get status for pod" podUID="061697ba-e757-4a93-8595-f749da70300d" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-r5smk\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object"
Apr 20 20:19:26.582197 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.582159 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr" podStartSLOduration=1.582148836 podStartE2EDuration="1.582148836s" podCreationTimestamp="2026-04-20 20:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:19:26.580978982 +0000 UTC m=+482.450757613" watchObservedRunningTime="2026-04-20 20:19:26.582148836 +0000 UTC m=+482.451927467"
Apr 20 20:19:26.582960 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.582936 2576 status_manager.go:895] "Failed to get status for pod" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qn6lz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qn6lz\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object"
Apr 20 20:19:26.584858 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.584836 2576 status_manager.go:895] "Failed to get status for pod" podUID="061697ba-e757-4a93-8595-f749da70300d" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-r5smk" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-r5smk\" is forbidden: User \"system:node:ip-10-0-129-247.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-247.ec2.internal' and this object"
Apr 20 20:19:26.612289 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.612244 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb" podStartSLOduration=1.612228797 podStartE2EDuration="1.612228797s" podCreationTimestamp="2026-04-20 20:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:19:26.611032097 +0000 UTC m=+482.480810753" watchObservedRunningTime="2026-04-20 20:19:26.612228797 +0000 UTC m=+482.482007429"
Apr 20 20:19:26.786674 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.786644 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01fd4ff6-b611-4c7d-9672-9e1f58d50432" path="/var/lib/kubelet/pods/01fd4ff6-b611-4c7d-9672-9e1f58d50432/volumes"
Apr 20 20:19:26.787054 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:26.787039 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="061697ba-e757-4a93-8595-f749da70300d" path="/var/lib/kubelet/pods/061697ba-e757-4a93-8595-f749da70300d/volumes"
Apr 20 20:19:37.555605 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.555575 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr"
Apr 20 20:19:37.555971 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.555634 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb"
Apr 20 20:19:37.622112 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.622081 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb"]
Apr 20 20:19:37.622357 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.622316 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb" podUID="293c47d2-e1aa-4b2b-8e1a-a70399994912" containerName="manager" containerID="cri-o://504c384119f8c81a7ae0f0d750db36b865185581d3b128a1ba7c99f7dd6686e1" gracePeriod=10
Apr 20 20:19:37.861291 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.861268 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb"
Apr 20 20:19:37.913926 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.913901 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/293c47d2-e1aa-4b2b-8e1a-a70399994912-extensions-socket-volume\") pod \"293c47d2-e1aa-4b2b-8e1a-a70399994912\" (UID: \"293c47d2-e1aa-4b2b-8e1a-a70399994912\") "
Apr 20 20:19:37.914084 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.914020 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prc8w\" (UniqueName: \"kubernetes.io/projected/293c47d2-e1aa-4b2b-8e1a-a70399994912-kube-api-access-prc8w\") pod \"293c47d2-e1aa-4b2b-8e1a-a70399994912\" (UID: \"293c47d2-e1aa-4b2b-8e1a-a70399994912\") "
Apr 20 20:19:37.914249 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.914223 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/293c47d2-e1aa-4b2b-8e1a-a70399994912-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "293c47d2-e1aa-4b2b-8e1a-a70399994912" (UID: "293c47d2-e1aa-4b2b-8e1a-a70399994912"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:19:37.916088 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.916058 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/293c47d2-e1aa-4b2b-8e1a-a70399994912-kube-api-access-prc8w" (OuterVolumeSpecName: "kube-api-access-prc8w") pod "293c47d2-e1aa-4b2b-8e1a-a70399994912" (UID: "293c47d2-e1aa-4b2b-8e1a-a70399994912"). InnerVolumeSpecName "kube-api-access-prc8w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:19:37.930301 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.930276 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"]
Apr 20 20:19:37.930651 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.930637 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="293c47d2-e1aa-4b2b-8e1a-a70399994912" containerName="manager"
Apr 20 20:19:37.930693 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.930654 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="293c47d2-e1aa-4b2b-8e1a-a70399994912" containerName="manager"
Apr 20 20:19:37.930765 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.930755 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="293c47d2-e1aa-4b2b-8e1a-a70399994912" containerName="manager"
Apr 20 20:19:37.933938 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.933923 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"
Apr 20 20:19:37.948605 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:37.948579 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"]
Apr 20 20:19:38.015391 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.015363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fae52e34-9ec8-4ab4-a941-23177da25dd6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-mttk2\" (UID: \"fae52e34-9ec8-4ab4-a941-23177da25dd6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"
Apr 20 20:19:38.015553 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.015418 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gptlp\" (UniqueName: \"kubernetes.io/projected/fae52e34-9ec8-4ab4-a941-23177da25dd6-kube-api-access-gptlp\") pod \"kuadrant-operator-controller-manager-55c7f4c975-mttk2\" (UID: \"fae52e34-9ec8-4ab4-a941-23177da25dd6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"
Apr 20 20:19:38.015609 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.015551 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-prc8w\" (UniqueName: \"kubernetes.io/projected/293c47d2-e1aa-4b2b-8e1a-a70399994912-kube-api-access-prc8w\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:19:38.015609 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.015576 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/293c47d2-e1aa-4b2b-8e1a-a70399994912-extensions-socket-volume\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:19:38.116524 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.116441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fae52e34-9ec8-4ab4-a941-23177da25dd6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-mttk2\" (UID: \"fae52e34-9ec8-4ab4-a941-23177da25dd6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"
Apr 20 20:19:38.116524 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.116502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gptlp\" (UniqueName: \"kubernetes.io/projected/fae52e34-9ec8-4ab4-a941-23177da25dd6-kube-api-access-gptlp\") pod \"kuadrant-operator-controller-manager-55c7f4c975-mttk2\" (UID: \"fae52e34-9ec8-4ab4-a941-23177da25dd6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"
Apr 20 20:19:38.116890 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.116867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fae52e34-9ec8-4ab4-a941-23177da25dd6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-mttk2\" (UID: \"fae52e34-9ec8-4ab4-a941-23177da25dd6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"
Apr 20 20:19:38.124527 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.124495 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gptlp\" (UniqueName: \"kubernetes.io/projected/fae52e34-9ec8-4ab4-a941-23177da25dd6-kube-api-access-gptlp\") pod \"kuadrant-operator-controller-manager-55c7f4c975-mttk2\" (UID: \"fae52e34-9ec8-4ab4-a941-23177da25dd6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"
Apr 20 20:19:38.243471 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.243434 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"
Apr 20 20:19:38.577000 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.576975 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"]
Apr 20 20:19:38.578331 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:19:38.578303 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfae52e34_9ec8_4ab4_a941_23177da25dd6.slice/crio-fad3d97a93d2f9caea38da271ee0f9a3fca617bedba372558494ed3a2f1782f4 WatchSource:0}: Error finding container fad3d97a93d2f9caea38da271ee0f9a3fca617bedba372558494ed3a2f1782f4: Status 404 returned error can't find the container with id fad3d97a93d2f9caea38da271ee0f9a3fca617bedba372558494ed3a2f1782f4
Apr 20 20:19:38.600696 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.600667 2576 generic.go:358] "Generic (PLEG): container finished" podID="293c47d2-e1aa-4b2b-8e1a-a70399994912" containerID="504c384119f8c81a7ae0f0d750db36b865185581d3b128a1ba7c99f7dd6686e1" exitCode=0
Apr 20 20:19:38.600814 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.600729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb" event={"ID":"293c47d2-e1aa-4b2b-8e1a-a70399994912","Type":"ContainerDied","Data":"504c384119f8c81a7ae0f0d750db36b865185581d3b128a1ba7c99f7dd6686e1"}
Apr 20 20:19:38.600814 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.600766 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb"
Apr 20 20:19:38.600814 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.600789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb" event={"ID":"293c47d2-e1aa-4b2b-8e1a-a70399994912","Type":"ContainerDied","Data":"848c3c4a3c6ce3be98dd40acc86b435785ed6673563a3316bb9ea3e4bbb3e389"}
Apr 20 20:19:38.600814 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.600807 2576 scope.go:117] "RemoveContainer" containerID="504c384119f8c81a7ae0f0d750db36b865185581d3b128a1ba7c99f7dd6686e1"
Apr 20 20:19:38.602166 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.602143 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2" event={"ID":"fae52e34-9ec8-4ab4-a941-23177da25dd6","Type":"ContainerStarted","Data":"fad3d97a93d2f9caea38da271ee0f9a3fca617bedba372558494ed3a2f1782f4"}
Apr 20 20:19:38.620842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.620819 2576 scope.go:117] "RemoveContainer" containerID="504c384119f8c81a7ae0f0d750db36b865185581d3b128a1ba7c99f7dd6686e1"
Apr 20 20:19:38.621122 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:19:38.621103 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"504c384119f8c81a7ae0f0d750db36b865185581d3b128a1ba7c99f7dd6686e1\": container with ID starting with 504c384119f8c81a7ae0f0d750db36b865185581d3b128a1ba7c99f7dd6686e1 not found: ID does not exist" containerID="504c384119f8c81a7ae0f0d750db36b865185581d3b128a1ba7c99f7dd6686e1"
Apr 20 20:19:38.621178 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.621130 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"504c384119f8c81a7ae0f0d750db36b865185581d3b128a1ba7c99f7dd6686e1"} err="failed to get container status \"504c384119f8c81a7ae0f0d750db36b865185581d3b128a1ba7c99f7dd6686e1\": rpc error: code = NotFound desc = could not find container \"504c384119f8c81a7ae0f0d750db36b865185581d3b128a1ba7c99f7dd6686e1\": container with ID starting with 504c384119f8c81a7ae0f0d750db36b865185581d3b128a1ba7c99f7dd6686e1 not found: ID does not exist"
Apr 20 20:19:38.630886 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.630863 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb"]
Apr 20 20:19:38.635550 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.635529 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-hh8lb"]
Apr 20 20:19:38.785482 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:38.785449 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="293c47d2-e1aa-4b2b-8e1a-a70399994912" path="/var/lib/kubelet/pods/293c47d2-e1aa-4b2b-8e1a-a70399994912/volumes"
Apr 20 20:19:39.608058 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:39.608017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2" event={"ID":"fae52e34-9ec8-4ab4-a941-23177da25dd6","Type":"ContainerStarted","Data":"e0e7dcda8747ebc556548c03acb46f095e2fe01448ea5f2cd42ee8f2c568d2be"}
Apr 20 20:19:39.608526 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:39.608141 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"
Apr 20 20:19:39.636966 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:39.636914 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2" podStartSLOduration=2.6368957159999997 podStartE2EDuration="2.636895716s" podCreationTimestamp="2026-04-20 20:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:19:39.6356292 +0000 UTC m=+495.505407831" watchObservedRunningTime="2026-04-20 20:19:39.636895716 +0000 UTC m=+495.506674349"
Apr 20 20:19:50.615119 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:50.615090 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"
Apr 20 20:19:50.660526 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:50.660492 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr"]
Apr 20 20:19:50.660772 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:50.660728 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr" podUID="7a10219d-4115-4469-85f2-5c1d78c8749c" containerName="manager" containerID="cri-o://7348d74fe9528bbe6c0576a01d114ebf1fb0ce7f3d89d4990247350d0f9ae988" gracePeriod=10
Apr 20 20:19:50.903469 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:50.903447 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr"
Apr 20 20:19:51.015087 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.015060 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb2wt\" (UniqueName: \"kubernetes.io/projected/7a10219d-4115-4469-85f2-5c1d78c8749c-kube-api-access-vb2wt\") pod \"7a10219d-4115-4469-85f2-5c1d78c8749c\" (UID: \"7a10219d-4115-4469-85f2-5c1d78c8749c\") "
Apr 20 20:19:51.015393 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.015119 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7a10219d-4115-4469-85f2-5c1d78c8749c-extensions-socket-volume\") pod \"7a10219d-4115-4469-85f2-5c1d78c8749c\" (UID: \"7a10219d-4115-4469-85f2-5c1d78c8749c\") "
Apr 20 20:19:51.015500 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.015479 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a10219d-4115-4469-85f2-5c1d78c8749c-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "7a10219d-4115-4469-85f2-5c1d78c8749c" (UID: "7a10219d-4115-4469-85f2-5c1d78c8749c"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:19:51.017110 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.017085 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a10219d-4115-4469-85f2-5c1d78c8749c-kube-api-access-vb2wt" (OuterVolumeSpecName: "kube-api-access-vb2wt") pod "7a10219d-4115-4469-85f2-5c1d78c8749c" (UID: "7a10219d-4115-4469-85f2-5c1d78c8749c"). InnerVolumeSpecName "kube-api-access-vb2wt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:19:51.115784 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.115720 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vb2wt\" (UniqueName: \"kubernetes.io/projected/7a10219d-4115-4469-85f2-5c1d78c8749c-kube-api-access-vb2wt\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:19:51.115900 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.115791 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7a10219d-4115-4469-85f2-5c1d78c8749c-extensions-socket-volume\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\""
Apr 20 20:19:51.658358 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.658320 2576 generic.go:358] "Generic (PLEG): container finished" podID="7a10219d-4115-4469-85f2-5c1d78c8749c" containerID="7348d74fe9528bbe6c0576a01d114ebf1fb0ce7f3d89d4990247350d0f9ae988" exitCode=0
Apr 20 20:19:51.658839 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.658413 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr"
Apr 20 20:19:51.658839 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.658416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr" event={"ID":"7a10219d-4115-4469-85f2-5c1d78c8749c","Type":"ContainerDied","Data":"7348d74fe9528bbe6c0576a01d114ebf1fb0ce7f3d89d4990247350d0f9ae988"}
Apr 20 20:19:51.658839 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.658468 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr" event={"ID":"7a10219d-4115-4469-85f2-5c1d78c8749c","Type":"ContainerDied","Data":"a779b551433d69bdcf5ac05252a553c0ffe3d122c0eead3398fce00f224ef5dd"}
Apr 20 20:19:51.658839 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.658490 2576 scope.go:117] "RemoveContainer" containerID="7348d74fe9528bbe6c0576a01d114ebf1fb0ce7f3d89d4990247350d0f9ae988"
Apr 20 20:19:51.667990 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.667836 2576 scope.go:117] "RemoveContainer" containerID="7348d74fe9528bbe6c0576a01d114ebf1fb0ce7f3d89d4990247350d0f9ae988"
Apr 20 20:19:51.668079 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:19:51.668059 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7348d74fe9528bbe6c0576a01d114ebf1fb0ce7f3d89d4990247350d0f9ae988\": container with ID starting with 7348d74fe9528bbe6c0576a01d114ebf1fb0ce7f3d89d4990247350d0f9ae988 not found: ID does not exist" containerID="7348d74fe9528bbe6c0576a01d114ebf1fb0ce7f3d89d4990247350d0f9ae988"
Apr 20 20:19:51.668121 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.668087 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7348d74fe9528bbe6c0576a01d114ebf1fb0ce7f3d89d4990247350d0f9ae988"} err="failed to get container status \"7348d74fe9528bbe6c0576a01d114ebf1fb0ce7f3d89d4990247350d0f9ae988\": rpc error: code = NotFound desc = could not find container \"7348d74fe9528bbe6c0576a01d114ebf1fb0ce7f3d89d4990247350d0f9ae988\": container with ID starting with 7348d74fe9528bbe6c0576a01d114ebf1fb0ce7f3d89d4990247350d0f9ae988 not found: ID does not exist"
Apr 20 20:19:51.680835 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.680810 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr"]
Apr 20 20:19:51.686861 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:51.686838 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n5fbr"]
Apr 20 20:19:52.785849 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:52.785816 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a10219d-4115-4469-85f2-5c1d78c8749c" path="/var/lib/kubelet/pods/7a10219d-4115-4469-85f2-5c1d78c8749c/volumes"
Apr 20 20:19:53.808761 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.808712 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"]
Apr 20 20:19:53.809474 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.809446 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a10219d-4115-4469-85f2-5c1d78c8749c" containerName="manager"
Apr 20 20:19:53.809474 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.809474 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a10219d-4115-4469-85f2-5c1d78c8749c" containerName="manager"
Apr 20 20:19:53.809634 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.809610 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a10219d-4115-4469-85f2-5c1d78c8749c" containerName="manager"
Apr 20 20:19:53.814461 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.814436 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:53.817272 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.817247 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-5sclf\""
Apr 20 20:19:53.822268 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.822243 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"]
Apr 20 20:19:53.939459 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.939422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:53.939627 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.939468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e7496337-dfbb-45af-938b-602bc0673098-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:53.939627 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.939546 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mn8v\" (UniqueName: \"kubernetes.io/projected/e7496337-dfbb-45af-938b-602bc0673098-kube-api-access-4mn8v\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:53.939627 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.939611 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e7496337-dfbb-45af-938b-602bc0673098-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:53.939817 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.939650 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:53.939817 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.939692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:53.939914 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.939766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:53.939914 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.939903 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e7496337-dfbb-45af-938b-602bc0673098-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:53.939993 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:53.939948 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.041259 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.041228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mn8v\" (UniqueName: \"kubernetes.io/projected/e7496337-dfbb-45af-938b-602bc0673098-kube-api-access-4mn8v\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.041259 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.041264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e7496337-dfbb-45af-938b-602bc0673098-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.041539 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.041512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.041612 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.041589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.045292 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.041722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.045292 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.041807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e7496337-dfbb-45af-938b-602bc0673098-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.045292 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.041840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.045292 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.041882 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.045292 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.041941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e7496337-dfbb-45af-938b-602bc0673098-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.045292 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.041982 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.045292 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.042254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.045292 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.042473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.045292 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.042574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e7496337-dfbb-45af-938b-602bc0673098-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.045292 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.042689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.045292 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.045218 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e7496337-dfbb-45af-938b-602bc0673098-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.047241 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.047214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e7496337-dfbb-45af-938b-602bc0673098-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.049911 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.049886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e7496337-dfbb-45af-938b-602bc0673098-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.050016 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.049958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mn8v\" (UniqueName: \"kubernetes.io/projected/e7496337-dfbb-45af-938b-602bc0673098-kube-api-access-4mn8v\") pod \"maas-default-gateway-openshift-default-845c6b4b48-p2bbn\" (UID: \"e7496337-dfbb-45af-938b-602bc0673098\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.129038 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.128983 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"
Apr 20 20:19:54.267616 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.267586 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn"]
Apr 20 20:19:54.268800 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:19:54.268774 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7496337_dfbb_45af_938b_602bc0673098.slice/crio-53b5950f93b863cd57f52c30b2e9da582bfbdb1bac07a8655d0c254700a75bcb WatchSource:0}: Error finding container 53b5950f93b863cd57f52c30b2e9da582bfbdb1bac07a8655d0c254700a75bcb: Status 404 returned error can't find the container with id 53b5950f93b863cd57f52c30b2e9da582bfbdb1bac07a8655d0c254700a75bcb
Apr 20 20:19:54.672797 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:54.672766 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn" event={"ID":"e7496337-dfbb-45af-938b-602bc0673098","Type":"ContainerStarted","Data":"53b5950f93b863cd57f52c30b2e9da582bfbdb1bac07a8655d0c254700a75bcb"}
Apr 20 20:19:56.812422 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:56.812389 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 20 20:19:56.812704 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:56.812453 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 20 20:19:56.812704 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:56.812478 2576 kubelet_resources.go:45] "Allocatable"
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 20 20:19:57.687934 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:57.687842 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn" event={"ID":"e7496337-dfbb-45af-938b-602bc0673098","Type":"ContainerStarted","Data":"2cfb666f0886caed8af5acc3c0f538d18d6daf790c282a45ca98f1ee9ec9c559"} Apr 20 20:19:57.709153 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:57.709109 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn" podStartSLOduration=2.167548022 podStartE2EDuration="4.709096198s" podCreationTimestamp="2026-04-20 20:19:53 +0000 UTC" firstStartedPulling="2026-04-20 20:19:54.270620783 +0000 UTC m=+510.140399399" lastFinishedPulling="2026-04-20 20:19:56.812168951 +0000 UTC m=+512.681947575" observedRunningTime="2026-04-20 20:19:57.706537638 +0000 UTC m=+513.576316271" watchObservedRunningTime="2026-04-20 20:19:57.709096198 +0000 UTC m=+513.578874828" Apr 20 20:19:58.129566 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:58.129536 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn" Apr 20 20:19:58.134105 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:58.134074 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn" Apr 20 20:19:58.692400 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:58.692373 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn" Apr 20 20:19:58.693315 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:19:58.693294 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-p2bbn" Apr 20 20:20:10.164123 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.164049 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kww9v"] Apr 20 20:20:10.167215 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.167199 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" Apr 20 20:20:10.169707 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.169681 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9hfnq\"" Apr 20 20:20:10.169826 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.169709 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 20:20:10.177596 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.177572 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kww9v"] Apr 20 20:20:10.259975 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.259945 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kww9v"] Apr 20 20:20:10.268938 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.268917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w9md\" (UniqueName: \"kubernetes.io/projected/e1d12a33-7168-4b4c-9f0e-2e91fefa779e-kube-api-access-2w9md\") pod \"limitador-limitador-7d549b5b-kww9v\" (UID: \"e1d12a33-7168-4b4c-9f0e-2e91fefa779e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" Apr 20 20:20:10.269046 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.268989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e1d12a33-7168-4b4c-9f0e-2e91fefa779e-config-file\") pod \"limitador-limitador-7d549b5b-kww9v\" (UID: \"e1d12a33-7168-4b4c-9f0e-2e91fefa779e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" Apr 20 20:20:10.369677 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.369649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e1d12a33-7168-4b4c-9f0e-2e91fefa779e-config-file\") pod \"limitador-limitador-7d549b5b-kww9v\" (UID: \"e1d12a33-7168-4b4c-9f0e-2e91fefa779e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" Apr 20 20:20:10.369843 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.369710 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w9md\" (UniqueName: \"kubernetes.io/projected/e1d12a33-7168-4b4c-9f0e-2e91fefa779e-kube-api-access-2w9md\") pod \"limitador-limitador-7d549b5b-kww9v\" (UID: \"e1d12a33-7168-4b4c-9f0e-2e91fefa779e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" Apr 20 20:20:10.370310 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.370285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e1d12a33-7168-4b4c-9f0e-2e91fefa779e-config-file\") pod \"limitador-limitador-7d549b5b-kww9v\" (UID: \"e1d12a33-7168-4b4c-9f0e-2e91fefa779e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" Apr 20 20:20:10.377617 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.377598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w9md\" (UniqueName: \"kubernetes.io/projected/e1d12a33-7168-4b4c-9f0e-2e91fefa779e-kube-api-access-2w9md\") pod \"limitador-limitador-7d549b5b-kww9v\" (UID: \"e1d12a33-7168-4b4c-9f0e-2e91fefa779e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" Apr 20 
20:20:10.479085 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.479007 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" Apr 20 20:20:10.606242 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.606209 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kww9v"] Apr 20 20:20:10.608302 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:20:10.608273 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1d12a33_7168_4b4c_9f0e_2e91fefa779e.slice/crio-3f975e99e91099fa6e0427459490a702ea7022656f203439852a0cb3edb0bc32 WatchSource:0}: Error finding container 3f975e99e91099fa6e0427459490a702ea7022656f203439852a0cb3edb0bc32: Status 404 returned error can't find the container with id 3f975e99e91099fa6e0427459490a702ea7022656f203439852a0cb3edb0bc32 Apr 20 20:20:10.742946 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:10.742866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" event={"ID":"e1d12a33-7168-4b4c-9f0e-2e91fefa779e","Type":"ContainerStarted","Data":"3f975e99e91099fa6e0427459490a702ea7022656f203439852a0cb3edb0bc32"} Apr 20 20:20:13.760349 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:13.760321 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" event={"ID":"e1d12a33-7168-4b4c-9f0e-2e91fefa779e","Type":"ContainerStarted","Data":"ddde60d645d92ad4bf9ed7d59bc1bb4e59ba20ee3b1296839ed2c139a885f80c"} Apr 20 20:20:13.760953 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:13.760456 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" Apr 20 20:20:13.778952 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:13.778877 2576 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" podStartSLOduration=0.872448276 podStartE2EDuration="3.778863621s" podCreationTimestamp="2026-04-20 20:20:10 +0000 UTC" firstStartedPulling="2026-04-20 20:20:10.610031259 +0000 UTC m=+526.479809867" lastFinishedPulling="2026-04-20 20:20:13.516446603 +0000 UTC m=+529.386225212" observedRunningTime="2026-04-20 20:20:13.777056963 +0000 UTC m=+529.646835595" watchObservedRunningTime="2026-04-20 20:20:13.778863621 +0000 UTC m=+529.648642251" Apr 20 20:20:24.719061 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:24.719039 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kww9v"] Apr 20 20:20:24.719313 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:24.719249 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" podUID="e1d12a33-7168-4b4c-9f0e-2e91fefa779e" containerName="limitador" containerID="cri-o://ddde60d645d92ad4bf9ed7d59bc1bb4e59ba20ee3b1296839ed2c139a885f80c" gracePeriod=30 Apr 20 20:20:24.719837 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:24.719816 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" Apr 20 20:20:25.265618 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.265593 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" Apr 20 20:20:25.388899 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.388826 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e1d12a33-7168-4b4c-9f0e-2e91fefa779e-config-file\") pod \"e1d12a33-7168-4b4c-9f0e-2e91fefa779e\" (UID: \"e1d12a33-7168-4b4c-9f0e-2e91fefa779e\") " Apr 20 20:20:25.388899 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.388880 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9md\" (UniqueName: \"kubernetes.io/projected/e1d12a33-7168-4b4c-9f0e-2e91fefa779e-kube-api-access-2w9md\") pod \"e1d12a33-7168-4b4c-9f0e-2e91fefa779e\" (UID: \"e1d12a33-7168-4b4c-9f0e-2e91fefa779e\") " Apr 20 20:20:25.389363 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.389337 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d12a33-7168-4b4c-9f0e-2e91fefa779e-config-file" (OuterVolumeSpecName: "config-file") pod "e1d12a33-7168-4b4c-9f0e-2e91fefa779e" (UID: "e1d12a33-7168-4b4c-9f0e-2e91fefa779e"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:20:25.392018 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.391990 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d12a33-7168-4b4c-9f0e-2e91fefa779e-kube-api-access-2w9md" (OuterVolumeSpecName: "kube-api-access-2w9md") pod "e1d12a33-7168-4b4c-9f0e-2e91fefa779e" (UID: "e1d12a33-7168-4b4c-9f0e-2e91fefa779e"). InnerVolumeSpecName "kube-api-access-2w9md". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:20:25.490081 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.490050 2576 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e1d12a33-7168-4b4c-9f0e-2e91fefa779e-config-file\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:20:25.490081 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.490081 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2w9md\" (UniqueName: \"kubernetes.io/projected/e1d12a33-7168-4b4c-9f0e-2e91fefa779e-kube-api-access-2w9md\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:20:25.811081 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.811051 2576 generic.go:358] "Generic (PLEG): container finished" podID="e1d12a33-7168-4b4c-9f0e-2e91fefa779e" containerID="ddde60d645d92ad4bf9ed7d59bc1bb4e59ba20ee3b1296839ed2c139a885f80c" exitCode=0 Apr 20 20:20:25.811476 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.811120 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" Apr 20 20:20:25.811476 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.811139 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" event={"ID":"e1d12a33-7168-4b4c-9f0e-2e91fefa779e","Type":"ContainerDied","Data":"ddde60d645d92ad4bf9ed7d59bc1bb4e59ba20ee3b1296839ed2c139a885f80c"} Apr 20 20:20:25.811476 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.811178 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-kww9v" event={"ID":"e1d12a33-7168-4b4c-9f0e-2e91fefa779e","Type":"ContainerDied","Data":"3f975e99e91099fa6e0427459490a702ea7022656f203439852a0cb3edb0bc32"} Apr 20 20:20:25.811476 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.811193 2576 scope.go:117] "RemoveContainer" containerID="ddde60d645d92ad4bf9ed7d59bc1bb4e59ba20ee3b1296839ed2c139a885f80c" Apr 20 20:20:25.819857 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.819838 2576 scope.go:117] "RemoveContainer" containerID="ddde60d645d92ad4bf9ed7d59bc1bb4e59ba20ee3b1296839ed2c139a885f80c" Apr 20 20:20:25.820108 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:20:25.820091 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddde60d645d92ad4bf9ed7d59bc1bb4e59ba20ee3b1296839ed2c139a885f80c\": container with ID starting with ddde60d645d92ad4bf9ed7d59bc1bb4e59ba20ee3b1296839ed2c139a885f80c not found: ID does not exist" containerID="ddde60d645d92ad4bf9ed7d59bc1bb4e59ba20ee3b1296839ed2c139a885f80c" Apr 20 20:20:25.820149 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.820117 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddde60d645d92ad4bf9ed7d59bc1bb4e59ba20ee3b1296839ed2c139a885f80c"} err="failed to get container status 
\"ddde60d645d92ad4bf9ed7d59bc1bb4e59ba20ee3b1296839ed2c139a885f80c\": rpc error: code = NotFound desc = could not find container \"ddde60d645d92ad4bf9ed7d59bc1bb4e59ba20ee3b1296839ed2c139a885f80c\": container with ID starting with ddde60d645d92ad4bf9ed7d59bc1bb4e59ba20ee3b1296839ed2c139a885f80c not found: ID does not exist" Apr 20 20:20:25.833729 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.833709 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kww9v"] Apr 20 20:20:25.837653 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:25.837633 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kww9v"] Apr 20 20:20:26.785248 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:26.785213 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1d12a33-7168-4b4c-9f0e-2e91fefa779e" path="/var/lib/kubelet/pods/e1d12a33-7168-4b4c-9f0e-2e91fefa779e/volumes" Apr 20 20:20:29.428074 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.428043 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-d6dc7"] Apr 20 20:20:29.428557 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.428538 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1d12a33-7168-4b4c-9f0e-2e91fefa779e" containerName="limitador" Apr 20 20:20:29.428638 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.428561 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d12a33-7168-4b4c-9f0e-2e91fefa779e" containerName="limitador" Apr 20 20:20:29.428690 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.428682 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1d12a33-7168-4b4c-9f0e-2e91fefa779e" containerName="limitador" Apr 20 20:20:29.438120 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.438088 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-d6dc7" Apr 20 20:20:29.438799 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.438516 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-d6dc7"] Apr 20 20:20:29.440720 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.440695 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 20 20:20:29.440847 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.440779 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-fl4k6\"" Apr 20 20:20:29.522679 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.522652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1e060f66-5e39-4bc3-9b8b-d9777603aa02-data\") pod \"postgres-868db5846d-d6dc7\" (UID: \"1e060f66-5e39-4bc3-9b8b-d9777603aa02\") " pod="opendatahub/postgres-868db5846d-d6dc7" Apr 20 20:20:29.522810 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.522757 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc46n\" (UniqueName: \"kubernetes.io/projected/1e060f66-5e39-4bc3-9b8b-d9777603aa02-kube-api-access-sc46n\") pod \"postgres-868db5846d-d6dc7\" (UID: \"1e060f66-5e39-4bc3-9b8b-d9777603aa02\") " pod="opendatahub/postgres-868db5846d-d6dc7" Apr 20 20:20:29.623390 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.623362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sc46n\" (UniqueName: \"kubernetes.io/projected/1e060f66-5e39-4bc3-9b8b-d9777603aa02-kube-api-access-sc46n\") pod \"postgres-868db5846d-d6dc7\" (UID: \"1e060f66-5e39-4bc3-9b8b-d9777603aa02\") " pod="opendatahub/postgres-868db5846d-d6dc7" Apr 20 20:20:29.623520 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.623407 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1e060f66-5e39-4bc3-9b8b-d9777603aa02-data\") pod \"postgres-868db5846d-d6dc7\" (UID: \"1e060f66-5e39-4bc3-9b8b-d9777603aa02\") " pod="opendatahub/postgres-868db5846d-d6dc7" Apr 20 20:20:29.623706 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.623689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1e060f66-5e39-4bc3-9b8b-d9777603aa02-data\") pod \"postgres-868db5846d-d6dc7\" (UID: \"1e060f66-5e39-4bc3-9b8b-d9777603aa02\") " pod="opendatahub/postgres-868db5846d-d6dc7" Apr 20 20:20:29.631184 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.631159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc46n\" (UniqueName: \"kubernetes.io/projected/1e060f66-5e39-4bc3-9b8b-d9777603aa02-kube-api-access-sc46n\") pod \"postgres-868db5846d-d6dc7\" (UID: \"1e060f66-5e39-4bc3-9b8b-d9777603aa02\") " pod="opendatahub/postgres-868db5846d-d6dc7" Apr 20 20:20:29.749525 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.749465 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-d6dc7" Apr 20 20:20:29.873585 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:29.873551 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-d6dc7"] Apr 20 20:20:29.874784 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:20:29.874754 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e060f66_5e39_4bc3_9b8b_d9777603aa02.slice/crio-2889093b7439997536d47a4cbacb7fcf260afc3c45a1b45cbb060504f5bbb97d WatchSource:0}: Error finding container 2889093b7439997536d47a4cbacb7fcf260afc3c45a1b45cbb060504f5bbb97d: Status 404 returned error can't find the container with id 2889093b7439997536d47a4cbacb7fcf260afc3c45a1b45cbb060504f5bbb97d Apr 20 20:20:30.832380 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:30.832344 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-d6dc7" event={"ID":"1e060f66-5e39-4bc3-9b8b-d9777603aa02","Type":"ContainerStarted","Data":"2889093b7439997536d47a4cbacb7fcf260afc3c45a1b45cbb060504f5bbb97d"} Apr 20 20:20:36.858232 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:36.858143 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-d6dc7" event={"ID":"1e060f66-5e39-4bc3-9b8b-d9777603aa02","Type":"ContainerStarted","Data":"2dceec28c07271c402f43ccff87365a9a0601baf84292d88c8a7e55c86bd970a"} Apr 20 20:20:36.858232 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:36.858186 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-d6dc7" Apr 20 20:20:36.873794 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:36.873731 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-d6dc7" podStartSLOduration=1.205705289 podStartE2EDuration="7.873718383s" podCreationTimestamp="2026-04-20 20:20:29 +0000 UTC" 
firstStartedPulling="2026-04-20 20:20:29.876000315 +0000 UTC m=+545.745778927" lastFinishedPulling="2026-04-20 20:20:36.54401341 +0000 UTC m=+552.413792021" observedRunningTime="2026-04-20 20:20:36.871944941 +0000 UTC m=+552.741723572" watchObservedRunningTime="2026-04-20 20:20:36.873718383 +0000 UTC m=+552.743497013" Apr 20 20:20:42.892643 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:42.892616 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-d6dc7" Apr 20 20:20:45.522411 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:45.522379 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rdtdq"] Apr 20 20:20:45.529642 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:45.529619 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-rdtdq" Apr 20 20:20:45.532556 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:45.532539 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-ghmlq\"" Apr 20 20:20:45.539764 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:45.539722 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rdtdq"] Apr 20 20:20:45.553040 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:45.553003 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lxrj\" (UniqueName: \"kubernetes.io/projected/3fdd5527-3f25-49f5-8544-9fa2ff40a054-kube-api-access-2lxrj\") pod \"maas-controller-6d4c8f55f9-rdtdq\" (UID: \"3fdd5527-3f25-49f5-8544-9fa2ff40a054\") " pod="opendatahub/maas-controller-6d4c8f55f9-rdtdq" Apr 20 20:20:45.653870 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:45.653845 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lxrj\" (UniqueName: 
\"kubernetes.io/projected/3fdd5527-3f25-49f5-8544-9fa2ff40a054-kube-api-access-2lxrj\") pod \"maas-controller-6d4c8f55f9-rdtdq\" (UID: \"3fdd5527-3f25-49f5-8544-9fa2ff40a054\") " pod="opendatahub/maas-controller-6d4c8f55f9-rdtdq" Apr 20 20:20:45.664428 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:45.664401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lxrj\" (UniqueName: \"kubernetes.io/projected/3fdd5527-3f25-49f5-8544-9fa2ff40a054-kube-api-access-2lxrj\") pod \"maas-controller-6d4c8f55f9-rdtdq\" (UID: \"3fdd5527-3f25-49f5-8544-9fa2ff40a054\") " pod="opendatahub/maas-controller-6d4c8f55f9-rdtdq" Apr 20 20:20:45.783267 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:45.783241 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rdtdq"] Apr 20 20:20:45.783420 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:45.783409 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-rdtdq" Apr 20 20:20:45.905863 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:45.905836 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rdtdq"] Apr 20 20:20:45.906900 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:20:45.906872 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fdd5527_3f25_49f5_8544_9fa2ff40a054.slice/crio-6afcc5d7daf4be68fcdc81f0532c7b8e174f40425dd91ecfea7a07ef3e9e34dd WatchSource:0}: Error finding container 6afcc5d7daf4be68fcdc81f0532c7b8e174f40425dd91ecfea7a07ef3e9e34dd: Status 404 returned error can't find the container with id 6afcc5d7daf4be68fcdc81f0532c7b8e174f40425dd91ecfea7a07ef3e9e34dd Apr 20 20:20:46.904353 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:46.904312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-rdtdq" 
event={"ID":"3fdd5527-3f25-49f5-8544-9fa2ff40a054","Type":"ContainerStarted","Data":"6afcc5d7daf4be68fcdc81f0532c7b8e174f40425dd91ecfea7a07ef3e9e34dd"} Apr 20 20:20:48.916477 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:48.916433 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-rdtdq" event={"ID":"3fdd5527-3f25-49f5-8544-9fa2ff40a054","Type":"ContainerStarted","Data":"8209401f6782b71e742a6b2402320b019a4f7966a71388d420d141a4e30aee28"} Apr 20 20:20:48.916938 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:48.916478 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-rdtdq" podUID="3fdd5527-3f25-49f5-8544-9fa2ff40a054" containerName="manager" containerID="cri-o://8209401f6782b71e742a6b2402320b019a4f7966a71388d420d141a4e30aee28" gracePeriod=10 Apr 20 20:20:48.916938 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:48.916517 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-rdtdq" Apr 20 20:20:48.933578 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:48.933539 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-rdtdq" podStartSLOduration=1.675128776 podStartE2EDuration="3.933528351s" podCreationTimestamp="2026-04-20 20:20:45 +0000 UTC" firstStartedPulling="2026-04-20 20:20:45.908122569 +0000 UTC m=+561.777901178" lastFinishedPulling="2026-04-20 20:20:48.166522141 +0000 UTC m=+564.036300753" observedRunningTime="2026-04-20 20:20:48.932434344 +0000 UTC m=+564.802212974" watchObservedRunningTime="2026-04-20 20:20:48.933528351 +0000 UTC m=+564.803307008" Apr 20 20:20:49.155105 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:49.155082 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-rdtdq" Apr 20 20:20:49.184120 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:49.184065 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lxrj\" (UniqueName: \"kubernetes.io/projected/3fdd5527-3f25-49f5-8544-9fa2ff40a054-kube-api-access-2lxrj\") pod \"3fdd5527-3f25-49f5-8544-9fa2ff40a054\" (UID: \"3fdd5527-3f25-49f5-8544-9fa2ff40a054\") " Apr 20 20:20:49.186289 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:49.186265 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fdd5527-3f25-49f5-8544-9fa2ff40a054-kube-api-access-2lxrj" (OuterVolumeSpecName: "kube-api-access-2lxrj") pod "3fdd5527-3f25-49f5-8544-9fa2ff40a054" (UID: "3fdd5527-3f25-49f5-8544-9fa2ff40a054"). InnerVolumeSpecName "kube-api-access-2lxrj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:20:49.285600 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:49.285575 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2lxrj\" (UniqueName: \"kubernetes.io/projected/3fdd5527-3f25-49f5-8544-9fa2ff40a054-kube-api-access-2lxrj\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:20:49.921543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:49.921512 2576 generic.go:358] "Generic (PLEG): container finished" podID="3fdd5527-3f25-49f5-8544-9fa2ff40a054" containerID="8209401f6782b71e742a6b2402320b019a4f7966a71388d420d141a4e30aee28" exitCode=0 Apr 20 20:20:49.922004 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:49.921579 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-rdtdq" Apr 20 20:20:49.922004 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:49.921594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-rdtdq" event={"ID":"3fdd5527-3f25-49f5-8544-9fa2ff40a054","Type":"ContainerDied","Data":"8209401f6782b71e742a6b2402320b019a4f7966a71388d420d141a4e30aee28"} Apr 20 20:20:49.922004 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:49.921627 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-rdtdq" event={"ID":"3fdd5527-3f25-49f5-8544-9fa2ff40a054","Type":"ContainerDied","Data":"6afcc5d7daf4be68fcdc81f0532c7b8e174f40425dd91ecfea7a07ef3e9e34dd"} Apr 20 20:20:49.922004 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:49.921643 2576 scope.go:117] "RemoveContainer" containerID="8209401f6782b71e742a6b2402320b019a4f7966a71388d420d141a4e30aee28" Apr 20 20:20:49.930983 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:49.930967 2576 scope.go:117] "RemoveContainer" containerID="8209401f6782b71e742a6b2402320b019a4f7966a71388d420d141a4e30aee28" Apr 20 20:20:49.931255 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:20:49.931237 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8209401f6782b71e742a6b2402320b019a4f7966a71388d420d141a4e30aee28\": container with ID starting with 8209401f6782b71e742a6b2402320b019a4f7966a71388d420d141a4e30aee28 not found: ID does not exist" containerID="8209401f6782b71e742a6b2402320b019a4f7966a71388d420d141a4e30aee28" Apr 20 20:20:49.931300 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:49.931262 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8209401f6782b71e742a6b2402320b019a4f7966a71388d420d141a4e30aee28"} err="failed to get container status \"8209401f6782b71e742a6b2402320b019a4f7966a71388d420d141a4e30aee28\": rpc error: 
code = NotFound desc = could not find container \"8209401f6782b71e742a6b2402320b019a4f7966a71388d420d141a4e30aee28\": container with ID starting with 8209401f6782b71e742a6b2402320b019a4f7966a71388d420d141a4e30aee28 not found: ID does not exist" Apr 20 20:20:49.942594 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:49.942568 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rdtdq"] Apr 20 20:20:49.945580 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:49.945561 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rdtdq"] Apr 20 20:20:50.792266 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:20:50.792223 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fdd5527-3f25-49f5-8544-9fa2ff40a054" path="/var/lib/kubelet/pods/3fdd5527-3f25-49f5-8544-9fa2ff40a054/volumes" Apr 20 20:21:24.711420 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:24.711390 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 20:21:24.713061 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:24.713036 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 20:21:24.715088 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:24.715069 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:21:24.716487 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:24.716470 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:21:31.850782 ip-10-0-129-247 kubenswrapper[2576]: I0420 
20:21:31.850752 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh"] Apr 20 20:21:31.853142 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:31.851103 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fdd5527-3f25-49f5-8544-9fa2ff40a054" containerName="manager" Apr 20 20:21:31.853142 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:31.851113 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fdd5527-3f25-49f5-8544-9fa2ff40a054" containerName="manager" Apr 20 20:21:31.853142 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:31.851181 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fdd5527-3f25-49f5-8544-9fa2ff40a054" containerName="manager" Apr 20 20:21:31.854194 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:31.854179 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:31.857954 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:31.857935 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-6d4f8\"" Apr 20 20:21:31.858086 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:31.857936 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 20:21:31.858086 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:31.857939 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 20:21:31.858086 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:31.857939 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 20 20:21:31.863101 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:31.863078 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh"] Apr 20 20:21:32.006308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.006267 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fac39e3-3529-4a24-ba04-358e08f201a4-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.006469 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.006316 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fac39e3-3529-4a24-ba04-358e08f201a4-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.006469 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.006359 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fac39e3-3529-4a24-ba04-358e08f201a4-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.006469 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.006393 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fac39e3-3529-4a24-ba04-358e08f201a4-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.006469 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.006418 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7g62\" (UniqueName: \"kubernetes.io/projected/1fac39e3-3529-4a24-ba04-358e08f201a4-kube-api-access-m7g62\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.006469 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.006450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fac39e3-3529-4a24-ba04-358e08f201a4-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.107033 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.106953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fac39e3-3529-4a24-ba04-358e08f201a4-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.107033 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.106998 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7g62\" (UniqueName: \"kubernetes.io/projected/1fac39e3-3529-4a24-ba04-358e08f201a4-kube-api-access-m7g62\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.107240 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.107039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fac39e3-3529-4a24-ba04-358e08f201a4-dshm\") pod 
\"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.107240 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.107149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fac39e3-3529-4a24-ba04-358e08f201a4-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.107240 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.107179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fac39e3-3529-4a24-ba04-358e08f201a4-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.107240 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.107229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fac39e3-3529-4a24-ba04-358e08f201a4-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.107868 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.107841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fac39e3-3529-4a24-ba04-358e08f201a4-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.108031 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.107991 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fac39e3-3529-4a24-ba04-358e08f201a4-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.108268 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.108063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fac39e3-3529-4a24-ba04-358e08f201a4-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.109973 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.109952 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fac39e3-3529-4a24-ba04-358e08f201a4-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.110628 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.110602 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fac39e3-3529-4a24-ba04-358e08f201a4-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: \"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.115213 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.115193 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7g62\" (UniqueName: \"kubernetes.io/projected/1fac39e3-3529-4a24-ba04-358e08f201a4-kube-api-access-m7g62\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh\" (UID: 
\"1fac39e3-3529-4a24-ba04-358e08f201a4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.165453 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.165423 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:32.303434 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.303264 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh"] Apr 20 20:21:32.305240 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:21:32.305213 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fac39e3_3529_4a24_ba04_358e08f201a4.slice/crio-2013a5cc6bf80eb28a2aadbc07e0b2ae9a50f26b5e56b3ea3db04a9528a56bf8 WatchSource:0}: Error finding container 2013a5cc6bf80eb28a2aadbc07e0b2ae9a50f26b5e56b3ea3db04a9528a56bf8: Status 404 returned error can't find the container with id 2013a5cc6bf80eb28a2aadbc07e0b2ae9a50f26b5e56b3ea3db04a9528a56bf8 Apr 20 20:21:32.309346 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:32.307500 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:21:33.102512 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:33.102462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" event={"ID":"1fac39e3-3529-4a24-ba04-358e08f201a4","Type":"ContainerStarted","Data":"2013a5cc6bf80eb28a2aadbc07e0b2ae9a50f26b5e56b3ea3db04a9528a56bf8"} Apr 20 20:21:35.640036 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.639997 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5"] Apr 20 20:21:35.643566 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.643550 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.645925 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.645906 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 20 20:21:35.653934 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.653907 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5"] Apr 20 20:21:35.736140 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.736116 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn7k8\" (UniqueName: \"kubernetes.io/projected/2099d162-44e0-4f93-8d46-42982261e0ef-kube-api-access-gn7k8\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.736293 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.736178 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2099d162-44e0-4f93-8d46-42982261e0ef-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.736293 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.736237 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2099d162-44e0-4f93-8d46-42982261e0ef-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.736293 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.736270 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2099d162-44e0-4f93-8d46-42982261e0ef-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.736293 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.736293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2099d162-44e0-4f93-8d46-42982261e0ef-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.736512 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.736348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2099d162-44e0-4f93-8d46-42982261e0ef-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.837579 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.837552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2099d162-44e0-4f93-8d46-42982261e0ef-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.837702 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.837590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gn7k8\" (UniqueName: \"kubernetes.io/projected/2099d162-44e0-4f93-8d46-42982261e0ef-kube-api-access-gn7k8\") pod 
\"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.837702 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.837641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2099d162-44e0-4f93-8d46-42982261e0ef-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.837702 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.837668 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2099d162-44e0-4f93-8d46-42982261e0ef-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.837702 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.837688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2099d162-44e0-4f93-8d46-42982261e0ef-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.837920 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.837897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2099d162-44e0-4f93-8d46-42982261e0ef-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.838026 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.838007 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2099d162-44e0-4f93-8d46-42982261e0ef-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.838077 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.838044 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2099d162-44e0-4f93-8d46-42982261e0ef-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.838128 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.838082 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2099d162-44e0-4f93-8d46-42982261e0ef-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.840049 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.840029 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2099d162-44e0-4f93-8d46-42982261e0ef-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.840308 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.840289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2099d162-44e0-4f93-8d46-42982261e0ef-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.845080 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.845060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn7k8\" (UniqueName: \"kubernetes.io/projected/2099d162-44e0-4f93-8d46-42982261e0ef-kube-api-access-gn7k8\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hv8x5\" (UID: \"2099d162-44e0-4f93-8d46-42982261e0ef\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:35.954276 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:35.954200 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:36.115778 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:36.115754 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5"] Apr 20 20:21:36.118266 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:21:36.118236 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2099d162_44e0_4f93_8d46_42982261e0ef.slice/crio-2972344ddb805fa38a41a111f0a0ef62afe5216a43c2f319140baa723017d8d7 WatchSource:0}: Error finding container 2972344ddb805fa38a41a111f0a0ef62afe5216a43c2f319140baa723017d8d7: Status 404 returned error can't find the container with id 2972344ddb805fa38a41a111f0a0ef62afe5216a43c2f319140baa723017d8d7 Apr 20 20:21:37.123928 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:37.123771 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" event={"ID":"2099d162-44e0-4f93-8d46-42982261e0ef","Type":"ContainerStarted","Data":"2972344ddb805fa38a41a111f0a0ef62afe5216a43c2f319140baa723017d8d7"} Apr 20 20:21:41.144571 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:41.144533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" event={"ID":"1fac39e3-3529-4a24-ba04-358e08f201a4","Type":"ContainerStarted","Data":"d8481d9e1e3bf787f8deae85be9f6b65be907c2e6b80216df32baae44e87356d"} Apr 20 20:21:41.146115 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:41.146085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" event={"ID":"2099d162-44e0-4f93-8d46-42982261e0ef","Type":"ContainerStarted","Data":"e7302a0f1e78870f8aa87fd32b2855516e0cb3a125be5581cb2fbc85d3799eb5"} Apr 20 20:21:46.170475 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:46.170441 2576 generic.go:358] "Generic (PLEG): container finished" podID="1fac39e3-3529-4a24-ba04-358e08f201a4" containerID="d8481d9e1e3bf787f8deae85be9f6b65be907c2e6b80216df32baae44e87356d" exitCode=0 Apr 20 20:21:46.170859 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:46.170518 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" event={"ID":"1fac39e3-3529-4a24-ba04-358e08f201a4","Type":"ContainerDied","Data":"d8481d9e1e3bf787f8deae85be9f6b65be907c2e6b80216df32baae44e87356d"} Apr 20 20:21:46.172023 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:46.172002 2576 generic.go:358] "Generic (PLEG): container finished" podID="2099d162-44e0-4f93-8d46-42982261e0ef" containerID="e7302a0f1e78870f8aa87fd32b2855516e0cb3a125be5581cb2fbc85d3799eb5" exitCode=0 Apr 20 20:21:46.172101 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:46.172079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" event={"ID":"2099d162-44e0-4f93-8d46-42982261e0ef","Type":"ContainerDied","Data":"e7302a0f1e78870f8aa87fd32b2855516e0cb3a125be5581cb2fbc85d3799eb5"} Apr 20 20:21:48.181950 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:48.181913 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" 
event={"ID":"2099d162-44e0-4f93-8d46-42982261e0ef","Type":"ContainerStarted","Data":"dbc78b1fb1d8b38b8bcf13852c6e1a2365df93dd49effef09615f98d493e7236"} Apr 20 20:21:48.182353 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:48.182139 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:21:48.183648 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:48.183630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" event={"ID":"1fac39e3-3529-4a24-ba04-358e08f201a4","Type":"ContainerStarted","Data":"a4286586f1afae1a2b4ed5e0f77f597d8bf30cc5ae1b29ada0f29e1718f51a51"} Apr 20 20:21:48.183830 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:48.183815 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:48.202022 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:48.201976 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" podStartSLOduration=1.851953929 podStartE2EDuration="13.201961725s" podCreationTimestamp="2026-04-20 20:21:35 +0000 UTC" firstStartedPulling="2026-04-20 20:21:36.120096458 +0000 UTC m=+611.989875074" lastFinishedPulling="2026-04-20 20:21:47.470104246 +0000 UTC m=+623.339882870" observedRunningTime="2026-04-20 20:21:48.19873252 +0000 UTC m=+624.068511152" watchObservedRunningTime="2026-04-20 20:21:48.201961725 +0000 UTC m=+624.071740357" Apr 20 20:21:48.216815 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:48.216729 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" podStartSLOduration=2.049942794 podStartE2EDuration="17.216713606s" podCreationTimestamp="2026-04-20 20:21:31 +0000 UTC" firstStartedPulling="2026-04-20 20:21:32.307667845 +0000 UTC 
m=+608.177446460" lastFinishedPulling="2026-04-20 20:21:47.474438661 +0000 UTC m=+623.344217272" observedRunningTime="2026-04-20 20:21:48.215621428 +0000 UTC m=+624.085400061" watchObservedRunningTime="2026-04-20 20:21:48.216713606 +0000 UTC m=+624.086492238" Apr 20 20:21:59.203216 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:59.203189 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh" Apr 20 20:21:59.204757 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:21:59.204714 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hv8x5" Apr 20 20:22:01.146148 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.146111 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz"] Apr 20 20:22:01.162231 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.162205 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz"] Apr 20 20:22:01.162362 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.162317 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.164897 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.164871 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 20 20:22:01.277119 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.277087 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bcf372b0-470c-4bad-a0db-eb3d9e627993-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.277119 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.277120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bcf372b0-470c-4bad-a0db-eb3d9e627993-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.277351 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.277153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sks8m\" (UniqueName: \"kubernetes.io/projected/bcf372b0-470c-4bad-a0db-eb3d9e627993-kube-api-access-sks8m\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.277351 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.277252 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bcf372b0-470c-4bad-a0db-eb3d9e627993-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: 
\"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.277351 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.277289 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcf372b0-470c-4bad-a0db-eb3d9e627993-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.277351 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.277316 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcf372b0-470c-4bad-a0db-eb3d9e627993-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.377893 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.377861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sks8m\" (UniqueName: \"kubernetes.io/projected/bcf372b0-470c-4bad-a0db-eb3d9e627993-kube-api-access-sks8m\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.378028 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.377920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bcf372b0-470c-4bad-a0db-eb3d9e627993-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.378028 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.377940 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcf372b0-470c-4bad-a0db-eb3d9e627993-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.378028 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.377957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcf372b0-470c-4bad-a0db-eb3d9e627993-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.378028 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.377996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bcf372b0-470c-4bad-a0db-eb3d9e627993-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.378028 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.378012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bcf372b0-470c-4bad-a0db-eb3d9e627993-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.378376 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.378354 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bcf372b0-470c-4bad-a0db-eb3d9e627993-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.378445 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.378369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcf372b0-470c-4bad-a0db-eb3d9e627993-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.378482 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.378463 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcf372b0-470c-4bad-a0db-eb3d9e627993-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.380420 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.380397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bcf372b0-470c-4bad-a0db-eb3d9e627993-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.380846 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.380828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bcf372b0-470c-4bad-a0db-eb3d9e627993-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.386059 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.386036 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sks8m\" (UniqueName: 
\"kubernetes.io/projected/bcf372b0-470c-4bad-a0db-eb3d9e627993-kube-api-access-sks8m\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz\" (UID: \"bcf372b0-470c-4bad-a0db-eb3d9e627993\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.473521 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.473472 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:01.616366 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:01.616306 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz"] Apr 20 20:22:01.617815 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:22:01.617787 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcf372b0_470c_4bad_a0db_eb3d9e627993.slice/crio-3d855b3a9cf35a887193364bef584e577c73e9a968baaacb5f2dd45eadcbe180 WatchSource:0}: Error finding container 3d855b3a9cf35a887193364bef584e577c73e9a968baaacb5f2dd45eadcbe180: Status 404 returned error can't find the container with id 3d855b3a9cf35a887193364bef584e577c73e9a968baaacb5f2dd45eadcbe180 Apr 20 20:22:02.244429 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:02.244398 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" event={"ID":"bcf372b0-470c-4bad-a0db-eb3d9e627993","Type":"ContainerStarted","Data":"e0c98aa7fd4944943ed3d1715d1a060547393908b385f16cccd315052a964021"} Apr 20 20:22:02.244842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:02.244436 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" event={"ID":"bcf372b0-470c-4bad-a0db-eb3d9e627993","Type":"ContainerStarted","Data":"3d855b3a9cf35a887193364bef584e577c73e9a968baaacb5f2dd45eadcbe180"} Apr 20 20:22:04.942928 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:04.942886 2576 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh"] Apr 20 20:22:04.961567 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:04.961538 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh"] Apr 20 20:22:04.961722 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:04.961646 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:04.964111 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:04.964085 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 20 20:22:05.013073 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.013042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc859598-9366-4fdf-a5e7-0d0705a14680-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.013186 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.013088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc859598-9366-4fdf-a5e7-0d0705a14680-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.013186 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.013137 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/bc859598-9366-4fdf-a5e7-0d0705a14680-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.013186 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.013156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bc859598-9366-4fdf-a5e7-0d0705a14680-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.013186 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.013171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg89h\" (UniqueName: \"kubernetes.io/projected/bc859598-9366-4fdf-a5e7-0d0705a14680-kube-api-access-mg89h\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.013321 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.013195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc859598-9366-4fdf-a5e7-0d0705a14680-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.114420 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.114386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bc859598-9366-4fdf-a5e7-0d0705a14680-home\") pod 
\"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.114420 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.114420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bc859598-9366-4fdf-a5e7-0d0705a14680-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.114726 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.114437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mg89h\" (UniqueName: \"kubernetes.io/projected/bc859598-9366-4fdf-a5e7-0d0705a14680-kube-api-access-mg89h\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.114726 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.114463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc859598-9366-4fdf-a5e7-0d0705a14680-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.114726 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.114574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc859598-9366-4fdf-a5e7-0d0705a14680-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.114726 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.114607 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc859598-9366-4fdf-a5e7-0d0705a14680-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.114966 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.114901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bc859598-9366-4fdf-a5e7-0d0705a14680-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.115014 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.114997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc859598-9366-4fdf-a5e7-0d0705a14680-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.115050 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.114992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc859598-9366-4fdf-a5e7-0d0705a14680-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.116896 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.116870 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bc859598-9366-4fdf-a5e7-0d0705a14680-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.117188 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.117169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc859598-9366-4fdf-a5e7-0d0705a14680-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.122828 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.122799 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg89h\" (UniqueName: \"kubernetes.io/projected/bc859598-9366-4fdf-a5e7-0d0705a14680-kube-api-access-mg89h\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh\" (UID: \"bc859598-9366-4fdf-a5e7-0d0705a14680\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.271809 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.271689 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:05.412572 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:05.412536 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh"] Apr 20 20:22:06.265843 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:06.265809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" event={"ID":"bc859598-9366-4fdf-a5e7-0d0705a14680","Type":"ContainerStarted","Data":"ff88b77d2335280f6d83a3d03fe51450b6040cf908e90b0e4380b7ca9d4f5f71"} Apr 20 20:22:06.266215 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:06.265852 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" event={"ID":"bc859598-9366-4fdf-a5e7-0d0705a14680","Type":"ContainerStarted","Data":"2f552900bc4c9b71166f02f00444f1bb8ba04fb6548808f98f548252e02d8e6c"} Apr 20 20:22:07.272093 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:07.272056 2576 generic.go:358] "Generic (PLEG): container finished" podID="bcf372b0-470c-4bad-a0db-eb3d9e627993" containerID="e0c98aa7fd4944943ed3d1715d1a060547393908b385f16cccd315052a964021" exitCode=0 Apr 20 20:22:07.272504 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:07.272142 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" event={"ID":"bcf372b0-470c-4bad-a0db-eb3d9e627993","Type":"ContainerDied","Data":"e0c98aa7fd4944943ed3d1715d1a060547393908b385f16cccd315052a964021"} Apr 20 20:22:08.278148 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:08.278110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" event={"ID":"bcf372b0-470c-4bad-a0db-eb3d9e627993","Type":"ContainerStarted","Data":"69f48867518dc7041b7ace8acab0d0bf1e0796dafc64f3844aff8b3721bc8423"} Apr 20 
20:22:08.278522 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:08.278342 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:08.298089 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:08.298044 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" podStartSLOduration=6.999180546 podStartE2EDuration="7.298027331s" podCreationTimestamp="2026-04-20 20:22:01 +0000 UTC" firstStartedPulling="2026-04-20 20:22:07.273026101 +0000 UTC m=+643.142804710" lastFinishedPulling="2026-04-20 20:22:07.571872872 +0000 UTC m=+643.441651495" observedRunningTime="2026-04-20 20:22:08.294717269 +0000 UTC m=+644.164495901" watchObservedRunningTime="2026-04-20 20:22:08.298027331 +0000 UTC m=+644.167805962" Apr 20 20:22:14.307657 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:14.307614 2576 generic.go:358] "Generic (PLEG): container finished" podID="bc859598-9366-4fdf-a5e7-0d0705a14680" containerID="ff88b77d2335280f6d83a3d03fe51450b6040cf908e90b0e4380b7ca9d4f5f71" exitCode=0 Apr 20 20:22:14.308152 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:14.307671 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" event={"ID":"bc859598-9366-4fdf-a5e7-0d0705a14680","Type":"ContainerDied","Data":"ff88b77d2335280f6d83a3d03fe51450b6040cf908e90b0e4380b7ca9d4f5f71"} Apr 20 20:22:15.315560 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:15.315526 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" event={"ID":"bc859598-9366-4fdf-a5e7-0d0705a14680","Type":"ContainerStarted","Data":"c55a709d338b205351d3a7d8b780e0eb8375445106377728bacc51c489b3a840"} Apr 20 20:22:15.315988 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:15.315722 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:22:15.335543 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:15.335492 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" podStartSLOduration=11.155565846 podStartE2EDuration="11.335479177s" podCreationTimestamp="2026-04-20 20:22:04 +0000 UTC" firstStartedPulling="2026-04-20 20:22:14.308523998 +0000 UTC m=+650.178302613" lastFinishedPulling="2026-04-20 20:22:14.488437335 +0000 UTC m=+650.358215944" observedRunningTime="2026-04-20 20:22:15.331573981 +0000 UTC m=+651.201352613" watchObservedRunningTime="2026-04-20 20:22:15.335479177 +0000 UTC m=+651.205257808" Apr 20 20:22:19.296908 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:19.296879 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz" Apr 20 20:22:26.332615 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:22:26.332581 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh" Apr 20 20:26:24.744589 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:26:24.744558 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 20:26:24.747999 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:26:24.747976 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:26:24.748546 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:26:24.748528 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 
20:26:24.751877 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:26:24.751856 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:31:24.778368 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:31:24.778342 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 20:31:24.782299 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:31:24.782277 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:31:24.783929 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:31:24.783909 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 20:31:24.787903 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:31:24.787885 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:35:01.474842 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:01.474813 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"] Apr 20 20:35:01.475345 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:01.475098 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2" podUID="fae52e34-9ec8-4ab4-a941-23177da25dd6" containerName="manager" containerID="cri-o://e0e7dcda8747ebc556548c03acb46f095e2fe01448ea5f2cd42ee8f2c568d2be" gracePeriod=10 Apr 20 20:35:01.821684 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:01.821662 2576 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2" Apr 20 20:35:01.932787 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:01.932717 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gptlp\" (UniqueName: \"kubernetes.io/projected/fae52e34-9ec8-4ab4-a941-23177da25dd6-kube-api-access-gptlp\") pod \"fae52e34-9ec8-4ab4-a941-23177da25dd6\" (UID: \"fae52e34-9ec8-4ab4-a941-23177da25dd6\") " Apr 20 20:35:01.932966 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:01.932861 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fae52e34-9ec8-4ab4-a941-23177da25dd6-extensions-socket-volume\") pod \"fae52e34-9ec8-4ab4-a941-23177da25dd6\" (UID: \"fae52e34-9ec8-4ab4-a941-23177da25dd6\") " Apr 20 20:35:01.933227 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:01.933201 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fae52e34-9ec8-4ab4-a941-23177da25dd6-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "fae52e34-9ec8-4ab4-a941-23177da25dd6" (UID: "fae52e34-9ec8-4ab4-a941-23177da25dd6"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:35:01.934959 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:01.934936 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae52e34-9ec8-4ab4-a941-23177da25dd6-kube-api-access-gptlp" (OuterVolumeSpecName: "kube-api-access-gptlp") pod "fae52e34-9ec8-4ab4-a941-23177da25dd6" (UID: "fae52e34-9ec8-4ab4-a941-23177da25dd6"). InnerVolumeSpecName "kube-api-access-gptlp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:35:02.034071 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:02.034043 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fae52e34-9ec8-4ab4-a941-23177da25dd6-extensions-socket-volume\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:35:02.034071 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:02.034070 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gptlp\" (UniqueName: \"kubernetes.io/projected/fae52e34-9ec8-4ab4-a941-23177da25dd6-kube-api-access-gptlp\") on node \"ip-10-0-129-247.ec2.internal\" DevicePath \"\"" Apr 20 20:35:02.464528 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:02.464430 2576 generic.go:358] "Generic (PLEG): container finished" podID="fae52e34-9ec8-4ab4-a941-23177da25dd6" containerID="e0e7dcda8747ebc556548c03acb46f095e2fe01448ea5f2cd42ee8f2c568d2be" exitCode=0 Apr 20 20:35:02.464528 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:02.464496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2" event={"ID":"fae52e34-9ec8-4ab4-a941-23177da25dd6","Type":"ContainerDied","Data":"e0e7dcda8747ebc556548c03acb46f095e2fe01448ea5f2cd42ee8f2c568d2be"} Apr 20 20:35:02.464528 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:02.464510 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2" Apr 20 20:35:02.464528 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:02.464526 2576 scope.go:117] "RemoveContainer" containerID="e0e7dcda8747ebc556548c03acb46f095e2fe01448ea5f2cd42ee8f2c568d2be" Apr 20 20:35:02.464839 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:02.464516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2" event={"ID":"fae52e34-9ec8-4ab4-a941-23177da25dd6","Type":"ContainerDied","Data":"fad3d97a93d2f9caea38da271ee0f9a3fca617bedba372558494ed3a2f1782f4"} Apr 20 20:35:02.479864 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:02.479680 2576 scope.go:117] "RemoveContainer" containerID="e0e7dcda8747ebc556548c03acb46f095e2fe01448ea5f2cd42ee8f2c568d2be" Apr 20 20:35:02.480125 ip-10-0-129-247 kubenswrapper[2576]: E0420 20:35:02.479972 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e7dcda8747ebc556548c03acb46f095e2fe01448ea5f2cd42ee8f2c568d2be\": container with ID starting with e0e7dcda8747ebc556548c03acb46f095e2fe01448ea5f2cd42ee8f2c568d2be not found: ID does not exist" containerID="e0e7dcda8747ebc556548c03acb46f095e2fe01448ea5f2cd42ee8f2c568d2be" Apr 20 20:35:02.480125 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:02.480006 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e7dcda8747ebc556548c03acb46f095e2fe01448ea5f2cd42ee8f2c568d2be"} err="failed to get container status \"e0e7dcda8747ebc556548c03acb46f095e2fe01448ea5f2cd42ee8f2c568d2be\": rpc error: code = NotFound desc = could not find container \"e0e7dcda8747ebc556548c03acb46f095e2fe01448ea5f2cd42ee8f2c568d2be\": container with ID starting with e0e7dcda8747ebc556548c03acb46f095e2fe01448ea5f2cd42ee8f2c568d2be not found: ID does not exist" Apr 20 20:35:02.489423 ip-10-0-129-247 
kubenswrapper[2576]: I0420 20:35:02.489402 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"] Apr 20 20:35:02.494765 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:02.494720 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mttk2"] Apr 20 20:35:02.786388 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:35:02.786359 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fae52e34-9ec8-4ab4-a941-23177da25dd6" path="/var/lib/kubelet/pods/fae52e34-9ec8-4ab4-a941-23177da25dd6/volumes" Apr 20 20:36:07.543943 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.543908 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf"] Apr 20 20:36:07.544408 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.544279 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fae52e34-9ec8-4ab4-a941-23177da25dd6" containerName="manager" Apr 20 20:36:07.544408 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.544289 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae52e34-9ec8-4ab4-a941-23177da25dd6" containerName="manager" Apr 20 20:36:07.544408 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.544375 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fae52e34-9ec8-4ab4-a941-23177da25dd6" containerName="manager" Apr 20 20:36:07.547369 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.547347 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf" Apr 20 20:36:07.549859 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.549836 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-552kj\"" Apr 20 20:36:07.558109 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.558084 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf"] Apr 20 20:36:07.647838 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.647810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ae0c757a-f4f6-4565-a461-e48ef8d04d94-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-x95bf\" (UID: \"ae0c757a-f4f6-4565-a461-e48ef8d04d94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf" Apr 20 20:36:07.647971 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.647853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm6lt\" (UniqueName: \"kubernetes.io/projected/ae0c757a-f4f6-4565-a461-e48ef8d04d94-kube-api-access-nm6lt\") pod \"kuadrant-operator-controller-manager-55c7f4c975-x95bf\" (UID: \"ae0c757a-f4f6-4565-a461-e48ef8d04d94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf" Apr 20 20:36:07.748927 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.748902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ae0c757a-f4f6-4565-a461-e48ef8d04d94-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-x95bf\" (UID: \"ae0c757a-f4f6-4565-a461-e48ef8d04d94\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf" Apr 20 20:36:07.749051 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.748942 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nm6lt\" (UniqueName: \"kubernetes.io/projected/ae0c757a-f4f6-4565-a461-e48ef8d04d94-kube-api-access-nm6lt\") pod \"kuadrant-operator-controller-manager-55c7f4c975-x95bf\" (UID: \"ae0c757a-f4f6-4565-a461-e48ef8d04d94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf" Apr 20 20:36:07.749253 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.749234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ae0c757a-f4f6-4565-a461-e48ef8d04d94-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-x95bf\" (UID: \"ae0c757a-f4f6-4565-a461-e48ef8d04d94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf" Apr 20 20:36:07.757203 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.757186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm6lt\" (UniqueName: \"kubernetes.io/projected/ae0c757a-f4f6-4565-a461-e48ef8d04d94-kube-api-access-nm6lt\") pod \"kuadrant-operator-controller-manager-55c7f4c975-x95bf\" (UID: \"ae0c757a-f4f6-4565-a461-e48ef8d04d94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf" Apr 20 20:36:07.858396 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.858341 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf" Apr 20 20:36:07.988148 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.988117 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf"] Apr 20 20:36:07.989535 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:36:07.989506 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae0c757a_f4f6_4565_a461_e48ef8d04d94.slice/crio-2a51dbfd1612243debf9e98589a9ed5056dd5a14399f7d9e074e3d30420d2acd WatchSource:0}: Error finding container 2a51dbfd1612243debf9e98589a9ed5056dd5a14399f7d9e074e3d30420d2acd: Status 404 returned error can't find the container with id 2a51dbfd1612243debf9e98589a9ed5056dd5a14399f7d9e074e3d30420d2acd Apr 20 20:36:07.992326 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:07.992311 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:36:08.734077 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:08.734043 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf" event={"ID":"ae0c757a-f4f6-4565-a461-e48ef8d04d94","Type":"ContainerStarted","Data":"f97fb8de098c4c5d45535bb9dc23bf62fdf86e4a6c0168e09190e540a609622b"} Apr 20 20:36:08.734077 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:08.734079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf" event={"ID":"ae0c757a-f4f6-4565-a461-e48ef8d04d94","Type":"ContainerStarted","Data":"2a51dbfd1612243debf9e98589a9ed5056dd5a14399f7d9e074e3d30420d2acd"} Apr 20 20:36:08.734493 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:08.734150 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf" Apr 20 20:36:08.755786 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:08.755715 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf" podStartSLOduration=1.755703706 podStartE2EDuration="1.755703706s" podCreationTimestamp="2026-04-20 20:36:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:36:08.753309171 +0000 UTC m=+1484.623087804" watchObservedRunningTime="2026-04-20 20:36:08.755703706 +0000 UTC m=+1484.625482385" Apr 20 20:36:19.741109 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:19.741071 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x95bf" Apr 20 20:36:24.813346 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:24.813315 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 20:36:24.817175 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:24.817144 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:36:24.820091 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:24.820064 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 20:36:24.823282 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:36:24.823267 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:41:24.849249 
ip-10-0-129-247 kubenswrapper[2576]: I0420 20:41:24.849219 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 20:41:24.852542 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:41:24.852523 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:41:24.854926 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:41:24.854897 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log" Apr 20 20:41:24.858356 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:41:24.858341 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log" Apr 20 20:45:49.548723 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:49.548637 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-bqld6_adeb5434-be14-4496-94c9-a8d5191d38a0/manager/0.log" Apr 20 20:45:50.016423 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:50.016311 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-56p4f_7370a528-f1a2-4465-bc6f-5d7195d5a29f/manager/1.log" Apr 20 20:45:50.352464 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:50.352432 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7f7bf89c4-hmcsd_be55442b-c3e3-4dcf-8da3-8653a73e8571/manager/0.log" Apr 20 20:45:50.454884 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:50.454855 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_postgres-868db5846d-d6dc7_1e060f66-5e39-4bc3-9b8b-d9777603aa02/postgres/0.log" Apr 20 20:45:51.219174 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:51.219133 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f_c92c179e-7812-4572-8b15-32cab54315cc/pull/0.log" Apr 20 20:45:51.224641 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:51.224599 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f_c92c179e-7812-4572-8b15-32cab54315cc/extract/0.log" Apr 20 20:45:51.229371 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:51.229353 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f_c92c179e-7812-4572-8b15-32cab54315cc/util/0.log" Apr 20 20:45:51.347866 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:51.347838 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl_d36d5a16-a21c-4243-b5bd-c92a268d4972/util/0.log" Apr 20 20:45:51.352452 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:51.352436 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl_d36d5a16-a21c-4243-b5bd-c92a268d4972/pull/0.log" Apr 20 20:45:51.356850 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:51.356833 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl_d36d5a16-a21c-4243-b5bd-c92a268d4972/extract/0.log" Apr 20 20:45:51.463721 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:51.463700 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc_442dbaa6-f6d0-4f7a-8b7f-76034bdc8733/extract/0.log" Apr 20 20:45:51.468179 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:51.468159 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc_442dbaa6-f6d0-4f7a-8b7f-76034bdc8733/util/0.log" Apr 20 20:45:51.472789 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:51.472724 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc_442dbaa6-f6d0-4f7a-8b7f-76034bdc8733/pull/0.log" Apr 20 20:45:51.576716 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:51.576686 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k_b663e802-1dfb-42e4-85a6-7ccdd15d1ede/util/0.log" Apr 20 20:45:51.581772 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:51.581725 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k_b663e802-1dfb-42e4-85a6-7ccdd15d1ede/pull/0.log" Apr 20 20:45:51.586252 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:51.586232 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k_b663e802-1dfb-42e4-85a6-7ccdd15d1ede/extract/0.log" Apr 20 20:45:51.914481 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:51.914454 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-2fn97_9f9c407d-4549-4dca-8ef1-7fabaf3b452b/manager/0.log" Apr 20 20:45:52.241675 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:52.241579 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-x95bf_ae0c757a-f4f6-4565-a461-e48ef8d04d94/manager/0.log" Apr 20 20:45:52.888524 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:52.888490 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-2mmxb_4d268fd0-1f66-44c4-a826-361d4dc917c0/discovery/0.log" Apr 20 20:45:53.108504 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:53.108481 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-66df9c9b9f-znl5h_c0b4504c-7479-45ed-936a-ac633867be44/kube-auth-proxy/0.log" Apr 20 20:45:53.222007 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:53.221940 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-p2bbn_e7496337-dfbb-45af-938b-602bc0673098/istio-proxy/0.log" Apr 20 20:45:53.323798 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:53.323774 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-55686795d4-xz2td_cea479a5-44c4-448f-8f4e-3e373aa915f3/router/0.log" Apr 20 20:45:53.637942 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:53.637919 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh_1fac39e3-3529-4a24-ba04-358e08f201a4/storage-initializer/0.log" Apr 20 20:45:53.644626 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:53.644608 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-c4vfh_1fac39e3-3529-4a24-ba04-358e08f201a4/main/0.log" Apr 20 20:45:53.743976 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:53.743957 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz_bcf372b0-470c-4bad-a0db-eb3d9e627993/storage-initializer/0.log" Apr 20 20:45:53.749510 ip-10-0-129-247 
kubenswrapper[2576]: I0420 20:45:53.749494 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-gmwvz_bcf372b0-470c-4bad-a0db-eb3d9e627993/main/0.log" Apr 20 20:45:53.857642 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:53.857623 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-hv8x5_2099d162-44e0-4f93-8d46-42982261e0ef/storage-initializer/0.log" Apr 20 20:45:53.862690 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:53.862675 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-hv8x5_2099d162-44e0-4f93-8d46-42982261e0ef/main/0.log" Apr 20 20:45:54.194404 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:54.194377 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh_bc859598-9366-4fdf-a5e7-0d0705a14680/storage-initializer/0.log" Apr 20 20:45:54.201162 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:45:54.201141 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-6m7mh_bc859598-9366-4fdf-a5e7-0d0705a14680/main/0.log" Apr 20 20:46:00.621645 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:00.621621 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wjlsd_a126d196-9791-4180-8ef6-3408f36fa528/global-pull-secret-syncer/0.log" Apr 20 20:46:00.707458 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:00.707419 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-lptm4_ff5c2154-884b-4487-9566-aadfee1d17e4/konnectivity-agent/0.log" Apr 20 20:46:00.763318 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:00.763280 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-247.ec2.internal_e535b6aadad4bcde1b5b507c970d0c4f/haproxy/0.log" Apr 20 20:46:04.615873 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:04.615841 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f_c92c179e-7812-4572-8b15-32cab54315cc/extract/0.log" Apr 20 20:46:04.640892 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:04.640862 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f_c92c179e-7812-4572-8b15-32cab54315cc/util/0.log" Apr 20 20:46:04.660519 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:04.660494 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75966z5f_c92c179e-7812-4572-8b15-32cab54315cc/pull/0.log" Apr 20 20:46:04.688866 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:04.688844 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl_d36d5a16-a21c-4243-b5bd-c92a268d4972/extract/0.log" Apr 20 20:46:04.714103 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:04.714078 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl_d36d5a16-a21c-4243-b5bd-c92a268d4972/util/0.log" Apr 20 20:46:04.732783 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:04.732765 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e06vkrl_d36d5a16-a21c-4243-b5bd-c92a268d4972/pull/0.log" Apr 20 20:46:04.757294 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:04.757272 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc_442dbaa6-f6d0-4f7a-8b7f-76034bdc8733/extract/0.log" Apr 20 20:46:04.775591 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:04.775570 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc_442dbaa6-f6d0-4f7a-8b7f-76034bdc8733/util/0.log" Apr 20 20:46:04.794785 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:04.794763 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73c98vc_442dbaa6-f6d0-4f7a-8b7f-76034bdc8733/pull/0.log" Apr 20 20:46:04.819201 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:04.819180 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k_b663e802-1dfb-42e4-85a6-7ccdd15d1ede/extract/0.log" Apr 20 20:46:04.839068 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:04.839044 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k_b663e802-1dfb-42e4-85a6-7ccdd15d1ede/util/0.log" Apr 20 20:46:04.858281 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:04.858266 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rhl6k_b663e802-1dfb-42e4-85a6-7ccdd15d1ede/pull/0.log" Apr 20 20:46:05.228147 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:05.228121 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-2fn97_9f9c407d-4549-4dca-8ef1-7fabaf3b452b/manager/0.log" Apr 20 20:46:05.349601 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:05.349572 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-x95bf_ae0c757a-f4f6-4565-a461-e48ef8d04d94/manager/0.log" Apr 20 20:46:07.066559 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:07.066518 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4qc4j_9c9454ed-ac56-495e-8da5-5f99c3919333/cluster-monitoring-operator/0.log" Apr 20 20:46:07.480287 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:07.480184 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tc7w5_2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9/node-exporter/0.log" Apr 20 20:46:07.502885 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:07.502855 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tc7w5_2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9/kube-rbac-proxy/0.log" Apr 20 20:46:07.532211 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:07.532177 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tc7w5_2a9255d8-9ff4-4cf9-840c-1d8a19b15ed9/init-textfile/0.log" Apr 20 20:46:07.566156 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:07.566131 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b8xbx_e142d0de-4827-4c18-a0da-8d75a43bb5a1/kube-rbac-proxy-main/0.log" Apr 20 20:46:07.599683 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:07.599666 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b8xbx_e142d0de-4827-4c18-a0da-8d75a43bb5a1/kube-rbac-proxy-self/0.log" Apr 20 20:46:07.632626 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:07.632608 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b8xbx_e142d0de-4827-4c18-a0da-8d75a43bb5a1/openshift-state-metrics/0.log" Apr 20 
20:46:07.947141 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:07.947107 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-tmzg4_9148d960-a444-4dba-a7d7-10703c00c120/prometheus-operator/0.log" Apr 20 20:46:07.968078 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:07.968056 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-tmzg4_9148d960-a444-4dba-a7d7-10703c00c120/kube-rbac-proxy/0.log" Apr 20 20:46:07.989903 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:07.989880 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-2n7mw_1e85e520-969d-4652-b000-05191ade92fc/prometheus-operator-admission-webhook/0.log" Apr 20 20:46:09.293918 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.293887 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd"] Apr 20 20:46:09.297724 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.297705 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" Apr 20 20:46:09.300675 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.300653 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wj6w5\"/\"openshift-service-ca.crt\"" Apr 20 20:46:09.302119 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.302100 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wj6w5\"/\"kube-root-ca.crt\"" Apr 20 20:46:09.302119 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.302101 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wj6w5\"/\"default-dockercfg-n67tm\"" Apr 20 20:46:09.306476 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.306453 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd"] Apr 20 20:46:09.366667 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.366640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4a6d8594-5585-4419-808c-807e75e25e78-podres\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" Apr 20 20:46:09.366812 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.366679 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4a6d8594-5585-4419-808c-807e75e25e78-proc\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" Apr 20 20:46:09.366812 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.366700 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a6d8594-5585-4419-808c-807e75e25e78-sys\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" Apr 20 20:46:09.366812 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.366774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crfm7\" (UniqueName: \"kubernetes.io/projected/4a6d8594-5585-4419-808c-807e75e25e78-kube-api-access-crfm7\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" Apr 20 20:46:09.366942 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.366825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a6d8594-5585-4419-808c-807e75e25e78-lib-modules\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" Apr 20 20:46:09.415563 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.415535 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-g897h_e41d9a3d-07e1-4603-8db1-ef454b3aa769/networking-console-plugin/0.log" Apr 20 20:46:09.467971 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.467948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4a6d8594-5585-4419-808c-807e75e25e78-proc\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" Apr 20 20:46:09.468081 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.467982 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a6d8594-5585-4419-808c-807e75e25e78-sys\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" Apr 20 20:46:09.468081 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.468003 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crfm7\" (UniqueName: \"kubernetes.io/projected/4a6d8594-5585-4419-808c-807e75e25e78-kube-api-access-crfm7\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" Apr 20 20:46:09.468081 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.468027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a6d8594-5585-4419-808c-807e75e25e78-lib-modules\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" Apr 20 20:46:09.468081 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.468057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4a6d8594-5585-4419-808c-807e75e25e78-proc\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" Apr 20 20:46:09.468299 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.468078 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a6d8594-5585-4419-808c-807e75e25e78-sys\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" Apr 20 
Apr 20 20:46:09.468299 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.468142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4a6d8594-5585-4419-808c-807e75e25e78-podres\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd"
Apr 20 20:46:09.468299 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.468174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a6d8594-5585-4419-808c-807e75e25e78-lib-modules\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd"
Apr 20 20:46:09.468409 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.468298 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4a6d8594-5585-4419-808c-807e75e25e78-podres\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd"
Apr 20 20:46:09.481117 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.481094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crfm7\" (UniqueName: \"kubernetes.io/projected/4a6d8594-5585-4419-808c-807e75e25e78-kube-api-access-crfm7\") pod \"perf-node-gather-daemonset-t4dbd\" (UID: \"4a6d8594-5585-4419-808c-807e75e25e78\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd"
Apr 20 20:46:09.609655 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.609597 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd"
Apr 20 20:46:09.743893 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.743871 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd"]
Apr 20 20:46:09.745703 ip-10-0-129-247 kubenswrapper[2576]: W0420 20:46:09.745676 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4a6d8594_5585_4419_808c_807e75e25e78.slice/crio-fd027b291469e7b729ff1f60b3944e8a9d45040063ce9e4827c07623261a1899 WatchSource:0}: Error finding container fd027b291469e7b729ff1f60b3944e8a9d45040063ce9e4827c07623261a1899: Status 404 returned error can't find the container with id fd027b291469e7b729ff1f60b3944e8a9d45040063ce9e4827c07623261a1899
Apr 20 20:46:09.747326 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.747309 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:46:09.994570 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.994494 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log"
Apr 20 20:46:09.999632 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:09.999610 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/2.log"
Apr 20 20:46:10.184682 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:10.184639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" event={"ID":"4a6d8594-5585-4419-808c-807e75e25e78","Type":"ContainerStarted","Data":"2ec4addc69a2904ab1b1138ab10abdf71f0a0b0da58ff9dd0d603ea49348fca9"}
Apr 20 20:46:10.184682 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:10.184675 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" event={"ID":"4a6d8594-5585-4419-808c-807e75e25e78","Type":"ContainerStarted","Data":"fd027b291469e7b729ff1f60b3944e8a9d45040063ce9e4827c07623261a1899"}
Apr 20 20:46:10.184911 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:10.184779 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd"
Apr 20 20:46:10.203653 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:10.203607 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd" podStartSLOduration=1.203592705 podStartE2EDuration="1.203592705s" podCreationTimestamp="2026-04-20 20:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:46:10.201532181 +0000 UTC m=+2086.071310810" watchObservedRunningTime="2026-04-20 20:46:10.203592705 +0000 UTC m=+2086.073371337"
Apr 20 20:46:10.444210 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:10.444182 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76d5b6554f-6cnkf_53b7bb3c-379a-458b-bbec-0c2d75a804bd/console/0.log"
Apr 20 20:46:10.471225 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:10.471203 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-lqqbc_2b6fdc39-f579-4589-9eb1-5edb0e8bf56b/download-server/0.log"
Apr 20 20:46:10.939446 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:10.939414 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-c65sl_7f1825d7-23bd-454f-9e8e-d0275aa2f4f9/volume-data-source-validator/0.log"
Apr 20 20:46:11.672956 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:11.672930 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gr79r_75a04807-4a1d-4a9f-9f26-b677e822247a/dns/0.log"
Apr 20 20:46:11.691613 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:11.691588 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gr79r_75a04807-4a1d-4a9f-9f26-b677e822247a/kube-rbac-proxy/0.log"
Apr 20 20:46:11.761190 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:11.761170 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rwqjv_f165a6a5-16a4-48c1-8e9a-9819f7939466/dns-node-resolver/0.log"
Apr 20 20:46:12.220330 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:12.220306 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hgbbc_2f856405-928f-4e0f-a6a4-b56a19061640/node-ca/0.log"
Apr 20 20:46:13.125656 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:13.125620 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-2mmxb_4d268fd0-1f66-44c4-a826-361d4dc917c0/discovery/0.log"
Apr 20 20:46:13.167671 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:13.167582 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-66df9c9b9f-znl5h_c0b4504c-7479-45ed-936a-ac633867be44/kube-auth-proxy/0.log"
Apr 20 20:46:13.221567 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:13.221543 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-p2bbn_e7496337-dfbb-45af-938b-602bc0673098/istio-proxy/0.log"
Apr 20 20:46:13.239576 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:13.239550 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-55686795d4-xz2td_cea479a5-44c4-448f-8f4e-3e373aa915f3/router/0.log"
Apr 20 20:46:13.703267 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:13.703237 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5c5mw_ae28c5cd-450f-42d0-a36c-1e045e920a41/serve-healthcheck-canary/0.log"
Apr 20 20:46:14.218829 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:14.218801 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dnc6s_01f326e1-f137-4977-9c82-2d9ca1b42e9d/kube-rbac-proxy/0.log"
Apr 20 20:46:14.235930 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:14.235907 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dnc6s_01f326e1-f137-4977-9c82-2d9ca1b42e9d/exporter/0.log"
Apr 20 20:46:14.254237 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:14.254173 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dnc6s_01f326e1-f137-4977-9c82-2d9ca1b42e9d/extractor/0.log"
Apr 20 20:46:16.199913 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:16.199884 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-t4dbd"
Apr 20 20:46:16.285505 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:16.285480 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-bqld6_adeb5434-be14-4496-94c9-a8d5191d38a0/manager/0.log"
Apr 20 20:46:16.420717 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:16.420688 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-56p4f_7370a528-f1a2-4465-bc6f-5d7195d5a29f/manager/0.log"
Apr 20 20:46:16.431268 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:16.431249 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-56p4f_7370a528-f1a2-4465-bc6f-5d7195d5a29f/manager/1.log"
Apr 20 20:46:16.516962 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:16.516904 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7f7bf89c4-hmcsd_be55442b-c3e3-4dcf-8da3-8653a73e8571/manager/0.log"
Apr 20 20:46:16.535725 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:16.535701 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-d6dc7_1e060f66-5e39-4bc3-9b8b-d9777603aa02/postgres/0.log"
Apr 20 20:46:22.415820 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:22.415788 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-s8wdf_cf54aa0b-6d6e-447c-9260-19fd40703cdc/kube-storage-version-migrator-operator/1.log"
Apr 20 20:46:22.416718 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:22.416694 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-s8wdf_cf54aa0b-6d6e-447c-9260-19fd40703cdc/kube-storage-version-migrator-operator/0.log"
Apr 20 20:46:23.353384 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:23.353356 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6lhxt_a9aaf97f-201c-4eed-b64e-a78004674964/kube-multus-additional-cni-plugins/0.log"
Apr 20 20:46:23.371866 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:23.371837 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6lhxt_a9aaf97f-201c-4eed-b64e-a78004674964/egress-router-binary-copy/0.log"
Apr 20 20:46:23.389382 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:23.389364 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6lhxt_a9aaf97f-201c-4eed-b64e-a78004674964/cni-plugins/0.log"
Apr 20 20:46:23.410171 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:23.410148 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6lhxt_a9aaf97f-201c-4eed-b64e-a78004674964/bond-cni-plugin/0.log"
Apr 20 20:46:23.428633 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:23.428605 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6lhxt_a9aaf97f-201c-4eed-b64e-a78004674964/routeoverride-cni/0.log"
Apr 20 20:46:23.447530 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:23.447510 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6lhxt_a9aaf97f-201c-4eed-b64e-a78004674964/whereabouts-cni-bincopy/0.log"
Apr 20 20:46:23.464342 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:23.464322 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6lhxt_a9aaf97f-201c-4eed-b64e-a78004674964/whereabouts-cni/0.log"
Apr 20 20:46:23.829302 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:23.829273 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nbfv2_7ff1077e-9f9b-4382-9161-a7e4d8da5193/kube-multus/0.log"
Apr 20 20:46:23.847372 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:23.847348 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-c89q8_62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2/network-metrics-daemon/0.log"
Apr 20 20:46:23.862988 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:23.862965 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-c89q8_62a073ac-fa6f-4ebe-a0d3-7ab63d9c9de2/kube-rbac-proxy/0.log"
Apr 20 20:46:24.741488 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:24.741461 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-controller/0.log"
Apr 20 20:46:24.755310 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:24.755287 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log"
Apr 20 20:46:24.764941 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:24.764904 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/1.log"
Apr 20 20:46:24.780891 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:24.780873 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/kube-rbac-proxy-node/0.log"
Apr 20 20:46:24.799380 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:24.799360 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 20:46:24.817427 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:24.817411 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/northd/0.log"
Apr 20 20:46:24.835048 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:24.835031 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/nbdb/0.log"
Apr 20 20:46:24.852004 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:24.851984 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/sbdb/0.log"
Apr 20 20:46:24.890655 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:24.890625 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log"
Apr 20 20:46:24.894619 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:24.894598 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log"
Apr 20 20:46:24.900178 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:24.900154 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6dwd9_f8ab594e-8a42-49c7-bbb9-e82d95d72ee3/console-operator/1.log"
Apr 20 20:46:24.908087 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:24.908069 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovn-acl-logging/0.log"
Apr 20 20:46:24.963777 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:24.963727 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jq6pm_bd7aaee9-0fd7-4edc-b3a6-ea8cc2c8856f/ovnkube-controller/0.log"
Apr 20 20:46:26.582424 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:26.582399 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-pts4j_448c06af-22c3-45bf-8cf3-2a424611877b/check-endpoints/0.log"
Apr 20 20:46:26.651491 ip-10-0-129-247 kubenswrapper[2576]: I0420 20:46:26.651466 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qcvnc_1f86d95f-49a9-4d51-afd5-7dc67dfd0cd7/network-check-target-container/0.log"