Apr 21 14:25:35.231407 ip-10-0-138-110 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 14:25:35.597127 ip-10-0-138-110 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 14:25:35.597127 ip-10-0-138-110 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 14:25:35.597127 ip-10-0-138-110 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 14:25:35.597127 ip-10-0-138-110 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 14:25:35.597127 ip-10-0-138-110 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
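The deprecation warnings above tell the operator to move these flags into the kubelet config file. A minimal sketch of the equivalent KubeletConfiguration follows; the field names are the documented config-file equivalents of the flags named in the log, but the resource values, eviction threshold, and plugin path are illustrative assumptions, not values read from this node:

```yaml
# Sketch of /etc/kubernetes/kubelet.conf (path taken from the --config flag
# printed later in this log). Values marked as placeholders are assumptions.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (socket path matches this log)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --system-reserved (amounts are placeholders)
systemReserved:
  cpu: 500m
  memory: 1Gi
# replaces --volume-plugin-dir (path is a placeholder)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --minimum-container-ttl-duration; the warning says to use
# eviction thresholds instead (threshold is a placeholder)
evictionHard:
  memory.available: 100Mi
```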
Apr 21 14:25:35.599279 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.599188 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 14:25:35.601515 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601499 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 14:25:35.601515 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601515 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601519 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601523 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601527 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601529 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601532 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601535 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601538 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601540 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601543 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601546 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601549 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601551 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601554 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601557 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601565 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601568 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601571 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601574 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601576 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 14:25:35.601575 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601580 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601582 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601585 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601588 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601591 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601594 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601596 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601599 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601603 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601607 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601610 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601613 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601616 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601619 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601622 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601625 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601628 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601630 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601635 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 14:25:35.602046 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601639 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601642 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601644 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601647 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601650 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601652 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601655 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601658 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601661 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601664 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601666 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601669 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601671 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601674 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601676 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601679 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601682 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601685 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601688 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 14:25:35.602811 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601690 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601693 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601695 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601698 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601703 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601708 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601712 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601715 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601718 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601720 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601723 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601726 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601729 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601732 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601734 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601737 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601740 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601742 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601745 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601749 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 14:25:35.603392 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601752 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601754 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601757 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601760 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601763 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601765 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.601768 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603577 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603590 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603595 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603598 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603601 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603604 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603607 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603610 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603613 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603616 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603619 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603621 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603624 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 14:25:35.603888 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603627 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603630 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603632 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603635 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603638 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603640 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603643 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603645 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603648 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603651 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603653 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603656 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603658 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603661 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603664 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603667 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603670 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603673 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603675 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603678 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 14:25:35.604415 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603680 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603683 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603685 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603688 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603690 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603693 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603695 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603697 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603701 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603703 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603705 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603708 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603710 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603713 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603715 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603718 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603720 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603723 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603725 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603728 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 14:25:35.604923 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603730 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603733 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603735 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603738 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603741 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603745 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603749 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603752 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603755 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603758 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603761 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603763 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603765 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603768 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603770 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603773 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603776 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603778 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603781 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603783 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 14:25:35.605469 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603786 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603788 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603791 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603793 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603795 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603799 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603801 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603805 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603807 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603809 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603814 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603818 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.603821 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605019 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605030 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605038 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605043 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605048 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605052 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605057 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605062 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 14:25:35.605966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605065 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605069 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605072 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605076 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605079 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605082 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605085 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605088 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605091 2576 flags.go:64] FLAG: --cloud-config=""
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605094 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605097 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605103 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605106 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605122 2576 flags.go:64] FLAG: --config-dir=""
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605125 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605128 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605132 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605136 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605140 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605143 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605146 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605149 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605152 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605155 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605158 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 14:25:35.606508 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605163 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605166 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605169 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605172 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605176 2576 flags.go:64] FLAG: --enable-server="true"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605179 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605184 2576 flags.go:64] FLAG: --event-burst="100"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605187 2576 flags.go:64] FLAG: --event-qps="50"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605190 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605193 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605196 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605200 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605203 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605206 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605209 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605212 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605214 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605217 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605220 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605223 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605226 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605228 2576 flags.go:64] FLAG: --feature-gates=""
Apr 21 14:25:35.607139 ip-10-0-138-110
kubenswrapper[2576]: I0421 14:25:35.605232 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605235 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605238 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 14:25:35.607139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605242 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605245 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605248 2576 flags.go:64] FLAG: --help="false" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605251 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-138-110.ec2.internal" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605255 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605258 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605260 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605265 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605268 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605271 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605275 2576 flags.go:64] FLAG: 
--image-service-endpoint="" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605278 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605281 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605284 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605288 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605291 2576 flags.go:64] FLAG: --kube-reserved="" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605294 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605297 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605300 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605303 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605306 2576 flags.go:64] FLAG: --lock-file="" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605309 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605312 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605315 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 14:25:35.607742 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605321 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: 
I0421 14:25:35.605324 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605327 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605330 2576 flags.go:64] FLAG: --logging-format="text" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605333 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605336 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605340 2576 flags.go:64] FLAG: --manifest-url="" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605343 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605347 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605351 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605355 2576 flags.go:64] FLAG: --max-pods="110" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605358 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605361 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605364 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605367 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605370 2576 flags.go:64] FLAG: 
--minimum-image-ttl-duration="2m0s" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605373 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605376 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605384 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605387 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605390 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605393 2576 flags.go:64] FLAG: --pod-cidr="" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605396 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605400 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 14:25:35.608347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605403 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605407 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605410 2576 flags.go:64] FLAG: --port="10250" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605413 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605416 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00973dac47d0815c2" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 
14:25:35.605419 2576 flags.go:64] FLAG: --qos-reserved="" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605422 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605425 2576 flags.go:64] FLAG: --register-node="true" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605428 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605431 2576 flags.go:64] FLAG: --register-with-taints="" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605434 2576 flags.go:64] FLAG: --registry-burst="10" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605437 2576 flags.go:64] FLAG: --registry-qps="5" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605440 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605443 2576 flags.go:64] FLAG: --reserved-memory="" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605447 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605450 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605453 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605456 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605461 2576 flags.go:64] FLAG: --runonce="false" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605464 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 
14:25:35.605468 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605471 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605474 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605477 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605481 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605484 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 14:25:35.609027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605487 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605490 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605493 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605496 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605499 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605502 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605505 2576 flags.go:64] FLAG: --system-cgroups="" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605508 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605514 2576 flags.go:64] FLAG: 
--system-reserved-cgroup="" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605516 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605519 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605524 2576 flags.go:64] FLAG: --tls-min-version="" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605527 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605530 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605533 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605535 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605538 2576 flags.go:64] FLAG: --v="2" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605545 2576 flags.go:64] FLAG: --version="false" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605550 2576 flags.go:64] FLAG: --vmodule="" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605554 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.605557 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605654 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605657 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 
14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605660 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 14:25:35.609679 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605663 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605667 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605670 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605672 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605675 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605678 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605680 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605683 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605685 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605688 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605691 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 
14:25:35.605693 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605696 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605698 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605701 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605704 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605707 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605709 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605712 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605715 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 14:25:35.610270 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605717 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605720 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605723 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605725 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:25:35.610802 ip-10-0-138-110 
kubenswrapper[2576]: W0421 14:25:35.605728 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605733 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605735 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605738 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605741 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605745 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605748 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605750 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605755 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605757 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605760 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605763 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605766 2576 
feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605771 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605774 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:25:35.610802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605776 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605779 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605782 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605784 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605787 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605789 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605792 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605794 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605797 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605800 2576 feature_gate.go:328] unrecognized feature gate: 
AdminNetworkPolicy Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605802 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605805 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605807 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605810 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605813 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605815 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605818 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605820 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605824 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605827 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 14:25:35.611286 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605829 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605832 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605835 2576 feature_gate.go:328] 
unrecognized feature gate: NewOLM Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605837 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605840 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605843 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605846 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605849 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605852 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605855 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605857 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605860 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605863 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605865 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605867 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 
14:25:35.605870 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605873 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605875 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605878 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605880 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 14:25:35.612130 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605883 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 14:25:35.612977 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605886 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:25:35.612977 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605888 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:25:35.612977 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.605891 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:25:35.612977 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.606497 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 14:25:35.613395 
ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.613373 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 14:25:35.613464 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.613396 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 14:25:35.613512 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613466 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 14:25:35.613512 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613474 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:25:35.613512 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613479 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 14:25:35.613512 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613484 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:25:35.613512 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613489 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:25:35.613512 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613494 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:25:35.613512 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613498 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:25:35.613512 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613503 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 14:25:35.613512 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613509 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:25:35.613512 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613514 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613518 2576 
feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613522 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613527 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613531 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613535 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613540 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613544 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613548 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613552 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613556 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613562 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613568 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613573 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613577 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613582 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613586 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613590 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613594 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 14:25:35.613942 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613598 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613603 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613607 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613612 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613617 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 
14:25:35.613621 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613625 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613629 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613633 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613637 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613642 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613647 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613651 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613655 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613659 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613665 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613669 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613673 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 
14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613678 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613682 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:25:35.614759 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613686 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613691 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613694 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613698 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613703 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613707 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613711 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613716 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613720 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613724 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613728 2576 
feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613732 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613736 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613740 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613745 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613751 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613758 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613762 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613766 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613770 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 14:25:35.615580 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613774 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613778 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613783 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 14:25:35.616439 
ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613787 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613791 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613796 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613800 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613805 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613810 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613814 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613818 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613823 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613827 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613832 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613836 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613840 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 
14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613844 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:25:35.616439 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.613849 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.613857 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614026 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614037 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614042 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614046 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614051 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614055 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614060 2576 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614064 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614068 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614072 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614077 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614081 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614087 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614093 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:25:35.617020 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614097 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614104 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614125 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614131 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614135 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614140 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614145 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614150 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614155 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614159 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614165 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614169 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614173 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614178 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614182 2576 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614186 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614191 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614195 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614199 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 14:25:35.617595 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614203 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614207 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614211 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614215 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614220 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614224 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614228 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614232 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: 
W0421 14:25:35.614236 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614240 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614245 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614249 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614254 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614257 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614261 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614266 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614270 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614274 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614277 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614282 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:25:35.618069 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614287 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 
14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614291 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614295 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614299 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614303 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614308 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614312 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614317 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614321 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614325 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614329 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614333 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614337 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614342 2576 
feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614346 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614350 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614354 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614358 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614362 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614367 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 14:25:35.618636 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614371 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 14:25:35.619189 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614375 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 14:25:35.619189 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614379 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:25:35.619189 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614384 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 14:25:35.619189 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614389 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:25:35.619189 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614393 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:25:35.619189 ip-10-0-138-110 
kubenswrapper[2576]: W0421 14:25:35.614397 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 14:25:35.619189 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614401 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:25:35.619189 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614405 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 14:25:35.619189 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614409 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:25:35.619189 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614414 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:25:35.619189 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614418 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 14:25:35.619189 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:35.614422 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 14:25:35.619189 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.614431 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 14:25:35.619189 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.614633 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 14:25:35.619189 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.617331 2576 bootstrap.go:101] "Use the 
bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 14:25:35.619566 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.618186 2576 server.go:1019] "Starting client certificate rotation" Apr 21 14:25:35.619566 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.618279 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 14:25:35.619566 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.618993 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 14:25:35.638917 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.638890 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 14:25:35.641314 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.641240 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 14:25:35.656909 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.656888 2576 log.go:25] "Validated CRI v1 runtime API" Apr 21 14:25:35.662548 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.662530 2576 log.go:25] "Validated CRI v1 image API" Apr 21 14:25:35.663762 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.663746 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 14:25:35.667062 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.667042 2576 fs.go:135] Filesystem UUIDs: map[6f394943-81a0-4311-b06f-73b933210e51:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 ba291075-69e7-445d-81cd-e198654ee90b:/dev/nvme0n1p4] Apr 21 14:25:35.667140 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.667061 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs 
blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 14:25:35.667758 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.667741 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 14:25:35.672907 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.672800 2576 manager.go:217] Machine: {Timestamp:2026-04-21 14:25:35.67103274 +0000 UTC m=+0.337053373 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100172 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2684e4d301bea92991dd8e50fe2b8b SystemUUID:ec2684e4-d301-bea9-2991-dd8e50fe2b8b BootID:4c78446e-ca47-449b-a5da-47e529139939 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d6:41:4b:63:9d Speed:0 
Mtu:9001} {Name:ens5 MacAddress:02:d6:41:4b:63:9d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:be:bf:31:70:10 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 14:25:35.672907 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.672903 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 21 14:25:35.673031 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.672986 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 14:25:35.673991 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.673966 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 14:25:35.674151 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.673992 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-110.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 14:25:35.674201 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.674159 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 14:25:35.674201 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.674169 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 14:25:35.674201 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.674183 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 14:25:35.675415 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.675404 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 14:25:35.676741 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.676730 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 21 14:25:35.677013 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.677003 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 14:25:35.678890 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.678880 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 21 14:25:35.678939 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.678894 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 14:25:35.678939 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.678912 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 14:25:35.678939 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.678922 2576 kubelet.go:397] "Adding apiserver pod source" Apr 21 14:25:35.678939 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.678933 2576 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 21 14:25:35.679982 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.679968 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 14:25:35.680030 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.679990 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 14:25:35.682449 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.682435 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 14:25:35.683901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.683887 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 14:25:35.685831 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.685818 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 14:25:35.685867 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.685839 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 14:25:35.685867 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.685848 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 14:25:35.685867 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.685856 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 14:25:35.685867 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.685865 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 14:25:35.685988 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.685888 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 14:25:35.685988 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.685897 2576 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 21 14:25:35.685988 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.685905 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 14:25:35.685988 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.685916 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 14:25:35.685988 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.685924 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 14:25:35.685988 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.685948 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 14:25:35.685988 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.685961 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 14:25:35.686646 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.686634 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 14:25:35.686676 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.686648 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 14:25:35.690495 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.690481 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 14:25:35.690574 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.690520 2576 server.go:1295] "Started kubelet" Apr 21 14:25:35.690625 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.690602 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 14:25:35.690724 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.690680 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 14:25:35.690771 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.690746 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 14:25:35.691165 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.691149 
2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-110.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 14:25:35.691229 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.691169 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-110.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 14:25:35.691287 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.691269 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 14:25:35.691502 ip-10-0-138-110 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 14:25:35.691874 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.691858 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 14:25:35.693692 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.693673 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 21 14:25:35.697822 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.697023 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-110.ec2.internal.18a86564964e0f56 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-110.ec2.internal,UID:ip-10-0-138-110.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-110.ec2.internal,},FirstTimestamp:2026-04-21 14:25:35.690493782 +0000 UTC m=+0.356514415,LastTimestamp:2026-04-21 14:25:35.690493782 +0000 UTC m=+0.356514415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-110.ec2.internal,}" Apr 21 14:25:35.698718 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.698698 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 14:25:35.698832 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.698812 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-74gvr" Apr 21 14:25:35.699176 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.699156 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 14:25:35.699847 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.699822 2576 
factory.go:55] Registering systemd factory Apr 21 14:25:35.699847 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.699851 2576 factory.go:223] Registration of the systemd container factory successfully Apr 21 14:25:35.699979 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.699902 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 14:25:35.699979 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.699917 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 14:25:35.700076 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.700013 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 14:25:35.700076 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.700032 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:35.700187 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.700092 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 21 14:25:35.700187 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.700099 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 21 14:25:35.700187 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.700099 2576 factory.go:153] Registering CRI-O factory Apr 21 14:25:35.700187 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.700130 2576 factory.go:223] Registration of the crio container factory successfully Apr 21 14:25:35.700187 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.700180 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 14:25:35.700428 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.700204 2576 factory.go:103] Registering Raw factory Apr 21 14:25:35.700428 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.700219 2576 
manager.go:1196] Started watching for new ooms in manager Apr 21 14:25:35.700523 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.700443 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 14:25:35.700731 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.700706 2576 manager.go:319] Starting recovery of all containers Apr 21 14:25:35.703702 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.703677 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 14:25:35.703812 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.703746 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-110.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 14:25:35.706935 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.706745 2576 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 21 14:25:35.707462 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.707438 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-74gvr" Apr 21 14:25:35.711554 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.711534 2576 manager.go:324] Recovery completed Apr 21 14:25:35.716296 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.716283 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 14:25:35.718600 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.718586 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:25:35.718663 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.718617 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:25:35.718663 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.718629 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:25:35.719161 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.719145 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 14:25:35.719210 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.719162 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 14:25:35.719210 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.719181 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 21 14:25:35.720608 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.720543 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-110.ec2.internal.18a8656497faecd6 default 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-110.ec2.internal,UID:ip-10-0-138-110.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-110.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-110.ec2.internal,},FirstTimestamp:2026-04-21 14:25:35.718599894 +0000 UTC m=+0.384620527,LastTimestamp:2026-04-21 14:25:35.718599894 +0000 UTC m=+0.384620527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-110.ec2.internal,}" Apr 21 14:25:35.722408 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.722393 2576 policy_none.go:49] "None policy: Start" Apr 21 14:25:35.722481 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.722412 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 14:25:35.722481 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.722438 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 21 14:25:35.774176 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.767663 2576 manager.go:341] "Starting Device Plugin manager" Apr 21 14:25:35.774176 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.767693 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 14:25:35.774176 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.767703 2576 server.go:85] "Starting device plugin registration server" Apr 21 14:25:35.774176 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.767964 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 14:25:35.774176 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.767975 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 14:25:35.774176 ip-10-0-138-110 kubenswrapper[2576]: 
I0421 14:25:35.768060 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 14:25:35.774176 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.768155 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 14:25:35.774176 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.768164 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 14:25:35.774176 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.768688 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 14:25:35.774176 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.768728 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:35.790416 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.790397 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 14:25:35.790520 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.790430 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 14:25:35.790520 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.790448 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 14:25:35.790520 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.790454 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 14:25:35.790520 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.790484 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 14:25:35.793741 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.793719 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:25:35.868176 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.868055 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 14:25:35.870482 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.870438 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:25:35.870590 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.870502 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:25:35.870590 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.870516 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:25:35.870590 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.870552 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-110.ec2.internal" Apr 21 14:25:35.878975 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.878955 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-110.ec2.internal" Apr 21 14:25:35.878975 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.878976 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-110.ec2.internal\": node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 
14:25:35.891373 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.891343 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-110.ec2.internal"] Apr 21 14:25:35.891470 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.891410 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 14:25:35.892220 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.892203 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:25:35.892303 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.892238 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:25:35.892303 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.892252 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:25:35.894740 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.894725 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 14:25:35.894890 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.894875 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal" Apr 21 14:25:35.894937 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.894904 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 14:25:35.895389 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.895371 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:25:35.895389 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.895385 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:25:35.895515 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.895405 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:25:35.895515 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.895419 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:25:35.895515 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.895405 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:25:35.895515 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.895485 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:25:35.896060 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.896042 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:35.897812 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.897798 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-110.ec2.internal" Apr 21 14:25:35.897877 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.897822 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 14:25:35.898571 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.898552 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:25:35.898640 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.898576 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:25:35.898640 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.898585 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:25:35.901087 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.901020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1c841ec6859af6aa2f42f3eb0143c101-config\") pod \"kube-apiserver-proxy-ip-10-0-138-110.ec2.internal\" (UID: \"1c841ec6859af6aa2f42f3eb0143c101\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-110.ec2.internal" Apr 21 14:25:35.901087 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.901063 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/077d02c6a0c7251870148fbde868cd7c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal\" (UID: \"077d02c6a0c7251870148fbde868cd7c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal" Apr 21 14:25:35.901234 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:35.901090 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/077d02c6a0c7251870148fbde868cd7c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal\" (UID: \"077d02c6a0c7251870148fbde868cd7c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal" Apr 21 14:25:35.920896 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.920874 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-110.ec2.internal\" not found" node="ip-10-0-138-110.ec2.internal" Apr 21 14:25:35.925225 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.925209 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-110.ec2.internal\" not found" node="ip-10-0-138-110.ec2.internal" Apr 21 14:25:35.997187 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:35.997159 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:36.001323 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.001303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1c841ec6859af6aa2f42f3eb0143c101-config\") pod \"kube-apiserver-proxy-ip-10-0-138-110.ec2.internal\" (UID: \"1c841ec6859af6aa2f42f3eb0143c101\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-110.ec2.internal" Apr 21 14:25:36.001419 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.001334 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/077d02c6a0c7251870148fbde868cd7c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal\" (UID: \"077d02c6a0c7251870148fbde868cd7c\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal" Apr 21 14:25:36.001419 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.001354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/077d02c6a0c7251870148fbde868cd7c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal\" (UID: \"077d02c6a0c7251870148fbde868cd7c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal" Apr 21 14:25:36.001419 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.001417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/077d02c6a0c7251870148fbde868cd7c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal\" (UID: \"077d02c6a0c7251870148fbde868cd7c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal" Apr 21 14:25:36.001562 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.001423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1c841ec6859af6aa2f42f3eb0143c101-config\") pod \"kube-apiserver-proxy-ip-10-0-138-110.ec2.internal\" (UID: \"1c841ec6859af6aa2f42f3eb0143c101\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-110.ec2.internal" Apr 21 14:25:36.001562 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.001417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/077d02c6a0c7251870148fbde868cd7c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal\" (UID: \"077d02c6a0c7251870148fbde868cd7c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal" Apr 21 14:25:36.098195 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:36.098152 2576 kubelet_node_status.go:515] 
"Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:36.198914 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:36.198872 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:36.222316 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.222292 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal" Apr 21 14:25:36.227885 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.227866 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-110.ec2.internal" Apr 21 14:25:36.299277 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:36.299237 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:36.399744 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:36.399705 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:36.500232 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:36.500145 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:36.600749 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:36.600716 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:36.618036 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.618016 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 14:25:36.618210 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.618190 2576 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 14:25:36.699003 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.698973 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 14:25:36.701510 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:36.701485 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:36.709499 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.709472 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 14:20:35 +0000 UTC" deadline="2027-12-29 05:30:02.729738608 +0000 UTC" Apr 21 14:25:36.709499 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.709496 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14799h4m26.02024552s" Apr 21 14:25:36.709656 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.709573 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 14:25:36.719376 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.719357 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:25:36.734515 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.734496 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wrp9f" Apr 21 14:25:36.742695 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.742675 2576 csr.go:270] "Certificate signing request is issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-wrp9f" Apr 21 14:25:36.782568 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:36.782530 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod077d02c6a0c7251870148fbde868cd7c.slice/crio-33c00879b7556dd3fc1e568191a91f76bfcf8b09908a452a1cbd9bc89795fcae WatchSource:0}: Error finding container 33c00879b7556dd3fc1e568191a91f76bfcf8b09908a452a1cbd9bc89795fcae: Status 404 returned error can't find the container with id 33c00879b7556dd3fc1e568191a91f76bfcf8b09908a452a1cbd9bc89795fcae Apr 21 14:25:36.782872 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:36.782855 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c841ec6859af6aa2f42f3eb0143c101.slice/crio-9a1f7743dca64c73c508bfb24ebf7cde2baaedb8243e4889ba2f80182321f870 WatchSource:0}: Error finding container 9a1f7743dca64c73c508bfb24ebf7cde2baaedb8243e4889ba2f80182321f870: Status 404 returned error can't find the container with id 9a1f7743dca64c73c508bfb24ebf7cde2baaedb8243e4889ba2f80182321f870 Apr 21 14:25:36.787578 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.787562 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 14:25:36.792834 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.792797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal" event={"ID":"077d02c6a0c7251870148fbde868cd7c","Type":"ContainerStarted","Data":"33c00879b7556dd3fc1e568191a91f76bfcf8b09908a452a1cbd9bc89795fcae"} Apr 21 14:25:36.793735 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.793714 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-110.ec2.internal" 
event={"ID":"1c841ec6859af6aa2f42f3eb0143c101","Type":"ContainerStarted","Data":"9a1f7743dca64c73c508bfb24ebf7cde2baaedb8243e4889ba2f80182321f870"} Apr 21 14:25:36.802043 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:36.802023 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:36.902531 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:36.902500 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:36.979537 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:36.979506 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:25:37.002779 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:37.002754 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:37.103305 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:37.103227 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:37.204076 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:37.204040 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-110.ec2.internal\" not found" Apr 21 14:25:37.274710 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.274678 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:25:37.299880 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.299849 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal" Apr 21 14:25:37.327059 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.327029 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's 
hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 14:25:37.327906 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.327885 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-110.ec2.internal" Apr 21 14:25:37.341907 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.341878 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 14:25:37.527037 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.527004 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:25:37.679956 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.679921 2576 apiserver.go:52] "Watching apiserver" Apr 21 14:25:37.686578 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.686554 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 14:25:37.689050 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.689012 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-4b46l","openshift-cluster-node-tuning-operator/tuned-fnk98","openshift-dns/node-resolver-5bc2h","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal","openshift-multus/multus-additional-cni-plugins-9hpmv","openshift-network-diagnostics/network-check-target-64qpt","openshift-ovn-kubernetes/ovnkube-node-tn95v","kube-system/kube-apiserver-proxy-ip-10-0-138-110.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f","openshift-image-registry/node-ca-nwm4s","openshift-multus/multus-hgpkp","openshift-multus/network-metrics-daemon-5bc88","openshift-network-operator/iptables-alerter-v7x72"] Apr 21 14:25:37.691782 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.691752 2576 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.693823 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.693798 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.694981 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.694951 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 14:25:37.695084 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.694962 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 14:25:37.695193 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.695175 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 14:25:37.695246 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.695180 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 14:25:37.695758 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.695739 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 14:25:37.695988 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.695970 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-66n5s\"" Apr 21 14:25:37.696089 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.696072 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:25:37.696178 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.696084 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 14:25:37.696178 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:37.696150 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092" Apr 21 14:25:37.696289 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.696087 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:25:37.696289 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.696206 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fq8sr\"" Apr 21 14:25:37.696289 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.696210 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 14:25:37.698383 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.698365 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.700782 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.700719 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-4b46l" Apr 21 14:25:37.702148 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.701751 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 14:25:37.702148 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.701784 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bg7fp\"" Apr 21 14:25:37.702148 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.701848 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 14:25:37.702148 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.701969 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 14:25:37.702148 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.702101 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 14:25:37.702551 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.702532 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 14:25:37.702804 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.702787 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-trsck\"" Apr 21 14:25:37.702973 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.702958 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:25:37.703051 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:37.703022 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55" Apr 21 14:25:37.703689 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.703590 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 14:25:37.705501 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.705483 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nwm4s" Apr 21 14:25:37.708034 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.708016 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:37.709613 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.709595 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 14:25:37.709817 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.709803 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 14:25:37.710045 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710031 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-t6cq2\"" Apr 21 14:25:37.710136 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710052 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a11169f-b65d-4081-aab8-2e91e9d09a48-ovnkube-config\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.710136 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a11169f-b65d-4081-aab8-2e91e9d09a48-env-overrides\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.710245 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-run\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 
14:25:37.710245 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-var-lib-cni-bin\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.710245 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7144a992-05ff-4c23-81b5-3daad4c438a6-multus-daemon-config\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.710384 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710246 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-kubelet\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.710384 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710256 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 14:25:37.710384 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-slash\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.710384 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710301 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-tuned\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.710384 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710334 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-run-ovn\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.710384 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710366 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78qdl\" (UniqueName: \"kubernetes.io/projected/1a11169f-b65d-4081-aab8-2e91e9d09a48-kube-api-access-78qdl\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.710662 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-sysctl-conf\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.710662 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-host\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.710662 
ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710461 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-etc-kubernetes\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.710662 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-system-cni-dir\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.710662 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710518 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-modprobe-d\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.710662 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710056 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 14:25:37.710662 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/296b2963-2f2f-4c9f-9186-e593ae45021a-tmp\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.710662 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710585 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-os-release\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.710662 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710618 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-var-lib-openvswitch\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.711066 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-kubernetes\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.711066 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710862 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lswn8\" (UniqueName: \"kubernetes.io/projected/bbebdf2e-6e19-434f-9e3e-bb9880123092-kube-api-access-lswn8\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:25:37.711066 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.710927 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-cnibin\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.711066 ip-10-0-138-110 kubenswrapper[2576]: I0421 
14:25:37.710971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-hostroot\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.711066 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-sysctl-d\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.711066 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711034 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 14:25:37.711066 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-run-openvswitch\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.711385 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-multus-cni-dir\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.711385 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-var-lib-kubelet\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.711385 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fw8c\" (UniqueName: \"kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c\") pod \"network-check-target-64qpt\" (UID: \"d952ffe7-f15c-40d1-840a-1201959dee55\") " pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:25:37.711385 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711237 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-node-log\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.711385 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.711385 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a11169f-b65d-4081-aab8-2e91e9d09a48-ovn-node-metrics-cert\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.711385 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711369 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7144a992-05ff-4c23-81b5-3daad4c438a6-cni-binary-copy\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.711649 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkvp5\" (UniqueName: \"kubernetes.io/projected/7144a992-05ff-4c23-81b5-3daad4c438a6-kube-api-access-tkvp5\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.711649 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711411 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 14:25:37.711649 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/77bed861-9016-43ee-825c-b20388894201-agent-certs\") pod \"konnectivity-agent-4b46l\" (UID: \"77bed861-9016-43ee-825c-b20388894201\") " pod="kube-system/konnectivity-agent-4b46l" Apr 21 14:25:37.711649 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711463 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-lib-modules\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.711649 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711493 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:25:37.711649 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-run-netns\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.711649 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/77bed861-9016-43ee-825c-b20388894201-konnectivity-ca\") pod \"konnectivity-agent-4b46l\" (UID: \"77bed861-9016-43ee-825c-b20388894201\") " pod="kube-system/konnectivity-agent-4b46l" Apr 21 14:25:37.711649 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711595 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-log-socket\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.711649 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-multus-socket-dir-parent\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " 
pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.712023 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-run-netns\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.712023 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711679 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-etc-openvswitch\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.712023 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711746 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-run-ovn-kubernetes\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.712023 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-run-multus-certs\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.712023 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.711862 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-cni-bin\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.712023 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.712003 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-sysconfig\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.712314 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.712055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-var-lib-kubelet\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.712314 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.712127 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfzfs\" (UniqueName: \"kubernetes.io/projected/296b2963-2f2f-4c9f-9186-e593ae45021a-kube-api-access-zfzfs\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.712314 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.712166 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-run-systemd\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.712314 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.712231 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-cni-netd\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.712314 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.712275 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a11169f-b65d-4081-aab8-2e91e9d09a48-ovnkube-script-lib\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.712539 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.712316 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-systemd\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.712539 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.712363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-multus-conf-dir\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.712539 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.712417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-systemd-units\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.712539 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.712451 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-sys\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.712539 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.712485 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-run-k8s-cni-cncf-io\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.712539 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.712518 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-var-lib-cni-multus\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.712813 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.712612 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-z72t5\"" Apr 21 14:25:37.714226 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.714205 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5bc2h" Apr 21 14:25:37.714317 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.714209 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.716447 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.716430 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 14:25:37.716537 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.716518 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 14:25:37.716602 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.716588 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-z4mpj\"" Apr 21 14:25:37.716918 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.716670 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-v7x72" Apr 21 14:25:37.716918 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.716746 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-gcqgm\"" Apr 21 14:25:37.716918 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.716780 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 14:25:37.716918 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.716806 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 14:25:37.719442 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.719307 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 14:25:37.719442 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.719317 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:25:37.719594 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.719570 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vn6nz\"" Apr 21 14:25:37.719689 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.719631 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 14:25:37.743972 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.743940 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 14:20:36 +0000 UTC" deadline="2027-11-12 14:15:18.785240517 +0000 UTC" Apr 21 14:25:37.744071 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.743972 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13679h49m41.041271324s" Apr 21 14:25:37.801830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.801766 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 14:25:37.812966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.812939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-cnibin\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.813098 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.812966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-hostroot\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.813098 ip-10-0-138-110 
kubenswrapper[2576]: I0421 14:25:37.813003 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-sysctl-d\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.813098 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a9148116-ff1f-4dd1-b372-36d40a8132b7-serviceca\") pod \"node-ca-nwm4s\" (UID: \"a9148116-ff1f-4dd1-b372-36d40a8132b7\") " pod="openshift-image-registry/node-ca-nwm4s" Apr 21 14:25:37.813098 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813056 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-sys-fs\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:37.813098 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-cnibin\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.813098 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813068 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-hostroot\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.813098 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813081 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-run-openvswitch\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-os-release\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-run-openvswitch\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-multus-cni-dir\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " 
pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813187 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-sysctl-d\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-var-lib-kubelet\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fw8c\" (UniqueName: \"kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c\") pod \"network-check-target-64qpt\" (UID: \"d952ffe7-f15c-40d1-840a-1201959dee55\") " pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-multus-cni-dir\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-var-lib-kubelet\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " 
pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-node-log\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-node-log\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a11169f-b65d-4081-aab8-2e91e9d09a48-ovn-node-metrics-cert\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-var-lib-cni-networks-ovn-kubernetes\") 
pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.813421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdm69\" (UniqueName: \"kubernetes.io/projected/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-kube-api-access-hdm69\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7144a992-05ff-4c23-81b5-3daad4c438a6-cni-binary-copy\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkvp5\" (UniqueName: \"kubernetes.io/projected/7144a992-05ff-4c23-81b5-3daad4c438a6-kube-api-access-tkvp5\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813509 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/77bed861-9016-43ee-825c-b20388894201-agent-certs\") pod \"konnectivity-agent-4b46l\" (UID: \"77bed861-9016-43ee-825c-b20388894201\") " pod="kube-system/konnectivity-agent-4b46l" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-lib-modules\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813598 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5524b629-aa78-45d5-95d5-b7a2b5f17b82-iptables-alerter-script\") pod \"iptables-alerter-v7x72\" (UID: \"5524b629-aa78-45d5-95d5-b7a2b5f17b82\") " pod="openshift-network-operator/iptables-alerter-v7x72" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813648 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmsz\" (UniqueName: \"kubernetes.io/projected/fbd47f5e-fbd3-42b6-9631-93eae13c275b-kube-api-access-vhmsz\") pod \"node-resolver-5bc2h\" (UID: \"fbd47f5e-fbd3-42b6-9631-93eae13c275b\") " pod="openshift-dns/node-resolver-5bc2h" Apr 21 14:25:37.814092 ip-10-0-138-110 
kubenswrapper[2576]: I0421 14:25:37.813677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-run-netns\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/77bed861-9016-43ee-825c-b20388894201-konnectivity-ca\") pod \"konnectivity-agent-4b46l\" (UID: \"77bed861-9016-43ee-825c-b20388894201\") " pod="kube-system/konnectivity-agent-4b46l" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-log-socket\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813720 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-multus-socket-dir-parent\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813828 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-run-netns\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-etc-openvswitch\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.814092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-run-ovn-kubernetes\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813906 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-run-multus-certs\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-cni-bin\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-sysconfig\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.813982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-var-lib-kubelet\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zfzfs\" (UniqueName: \"kubernetes.io/projected/296b2963-2f2f-4c9f-9186-e593ae45021a-kube-api-access-zfzfs\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814057 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-run-systemd\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814062 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-log-socket\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814072 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7144a992-05ff-4c23-81b5-3daad4c438a6-cni-binary-copy\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814082 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-run-ovn-kubernetes\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-cni-netd\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-run-netns\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-cni-netd\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:37.814184 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-multus-socket-dir-parent\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp" Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-run-multus-certs\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") 
" pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814251 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-cni-bin\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a11169f-b65d-4081-aab8-2e91e9d09a48-ovnkube-script-lib\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.814946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-etc-openvswitch\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-lib-modules\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-run-netns\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") "
pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814409 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-run-systemd\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-systemd\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-cnibin\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814515 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-socket-dir\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-var-lib-kubelet\") pod \"tuned-fnk98\" (UID:
\"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-multus-conf-dir\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-systemd\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-systemd-units\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-sysconfig\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814611 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-sys\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") "
pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sdzs\" (UniqueName: \"kubernetes.io/projected/5524b629-aa78-45d5-95d5-b7a2b5f17b82-kube-api-access-7sdzs\") pod \"iptables-alerter-v7x72\" (UID: \"5524b629-aa78-45d5-95d5-b7a2b5f17b82\") " pod="openshift-network-operator/iptables-alerter-v7x72"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-systemd-units\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-multus-conf-dir\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-run-k8s-cni-cncf-io\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.815729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814694 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-sys\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") "
pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:37.814706 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs podName:bbebdf2e-6e19-434f-9e3e-bb9880123092 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:38.314663637 +0000 UTC m=+2.980684270 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs") pod "network-metrics-daemon-5bc88" (UID: "bbebdf2e-6e19-434f-9e3e-bb9880123092") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-run-k8s-cni-cncf-io\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814794 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-var-lib-cni-multus\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/77bed861-9016-43ee-825c-b20388894201-konnectivity-ca\") pod \"konnectivity-agent-4b46l\" (UID: \"77bed861-9016-43ee-825c-b20388894201\") " pod="kube-system/konnectivity-agent-4b46l"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]:
I0421 14:25:37.814825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a11169f-b65d-4081-aab8-2e91e9d09a48-ovnkube-config\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-var-lib-cni-multus\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814857 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a11169f-b65d-4081-aab8-2e91e9d09a48-env-overrides\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814882 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-run\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814889 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a11169f-b65d-4081-aab8-2e91e9d09a48-ovnkube-script-lib\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421
14:25:37.814909 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv296\" (UniqueName: \"kubernetes.io/projected/a9148116-ff1f-4dd1-b372-36d40a8132b7-kube-api-access-hv296\") pod \"node-ca-nwm4s\" (UID: \"a9148116-ff1f-4dd1-b372-36d40a8132b7\") " pod="openshift-image-registry/node-ca-nwm4s"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814959 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-device-dir\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-var-lib-cni-bin\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.814958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-run\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-system-cni-dir\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") "
pod="openshift-multus/multus-additional-cni-plugins-9hpmv"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7144a992-05ff-4c23-81b5-3daad4c438a6-multus-daemon-config\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815061 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-kubelet\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.816306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815061 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-host-var-lib-cni-bin\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815097 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-slash\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815137 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-kubelet\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") "
pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-tuned\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815160 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-host-slash\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9148116-ff1f-4dd1-b372-36d40a8132b7-host\") pod \"node-ca-nwm4s\" (UID: \"a9148116-ff1f-4dd1-b372-36d40a8132b7\") " pod="openshift-image-registry/node-ca-nwm4s"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-cni-binary-copy\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815223 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a11169f-b65d-4081-aab8-2e91e9d09a48-env-overrides\") pod \"ovnkube-node-tn95v\" (UID:
\"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5524b629-aa78-45d5-95d5-b7a2b5f17b82-host-slash\") pod \"iptables-alerter-v7x72\" (UID: \"5524b629-aa78-45d5-95d5-b7a2b5f17b82\") " pod="openshift-network-operator/iptables-alerter-v7x72"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-run-ovn\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78qdl\" (UniqueName: \"kubernetes.io/projected/1a11169f-b65d-4081-aab8-2e91e9d09a48-kube-api-access-78qdl\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-sysctl-conf\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815348 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-run-ovn\") pod
\"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815312 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a11169f-b65d-4081-aab8-2e91e9d09a48-ovnkube-config\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-host\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815412 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-registration-dir\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f"
Apr 21 14:25:37.817053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815439 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-2drc4\" (UniqueName: \"kubernetes.io/projected/7f75f6e6-764d-495a-9847-751ba9625fa1-kube-api-access-2drc4\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f"
Apr 21 14:25:37.817830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815465 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-etc-kubernetes\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.817830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-host\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.817830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815505 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-sysctl-conf\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.817830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815499 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-system-cni-dir\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.817830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\"
(UniqueName: \"kubernetes.io/configmap/7144a992-05ff-4c23-81b5-3daad4c438a6-multus-daemon-config\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.817830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-etc-kubernetes\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.817830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-system-cni-dir\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.817830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-modprobe-d\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.817830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/296b2963-2f2f-4c9f-9186-e593ae45021a-tmp\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.817830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815658 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName:
\"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-etc-selinux\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f"
Apr 21 14:25:37.817830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fbd47f5e-fbd3-42b6-9631-93eae13c275b-hosts-file\") pod \"node-resolver-5bc2h\" (UID: \"fbd47f5e-fbd3-42b6-9631-93eae13c275b\") " pod="openshift-dns/node-resolver-5bc2h"
Apr 21 14:25:37.817830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815699 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-modprobe-d\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.817830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-os-release\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.817830 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815746 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-var-lib-openvswitch\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.818526 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.818198 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\"
(UniqueName: \"kubernetes.io/host-path/1a11169f-b65d-4081-aab8-2e91e9d09a48-var-lib-openvswitch\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.818526 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.815761 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7144a992-05ff-4c23-81b5-3daad4c438a6-os-release\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.818526 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.818298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-kubernetes\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.818526 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.818342 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lswn8\" (UniqueName: \"kubernetes.io/projected/bbebdf2e-6e19-434f-9e3e-bb9880123092-kube-api-access-lswn8\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:37.818526 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.818377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbd47f5e-fbd3-42b6-9631-93eae13c275b-tmp-dir\") pod \"node-resolver-5bc2h\" (UID: \"fbd47f5e-fbd3-42b6-9631-93eae13c275b\") " pod="openshift-dns/node-resolver-5bc2h"
Apr 21 14:25:37.818526 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.818489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\"
(UniqueName: \"kubernetes.io/host-path/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-kubernetes\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.819494 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.819035 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/296b2963-2f2f-4c9f-9186-e593ae45021a-etc-tuned\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.819494 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.819137 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a11169f-b65d-4081-aab8-2e91e9d09a48-ovn-node-metrics-cert\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v"
Apr 21 14:25:37.819494 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.819178 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/77bed861-9016-43ee-825c-b20388894201-agent-certs\") pod \"konnectivity-agent-4b46l\" (UID: \"77bed861-9016-43ee-825c-b20388894201\") " pod="kube-system/konnectivity-agent-4b46l"
Apr 21 14:25:37.819494 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:37.819348 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 14:25:37.819494 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:37.819368 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 14:25:37.819494 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:37.819381 2576
projected.go:194] Error preparing data for projected volume kube-api-access-2fw8c for pod openshift-network-diagnostics/network-check-target-64qpt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:37.819494 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:37.819449 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c podName:d952ffe7-f15c-40d1-840a-1201959dee55 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:38.319432262 +0000 UTC m=+2.985452899 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2fw8c" (UniqueName: "kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c") pod "network-check-target-64qpt" (UID: "d952ffe7-f15c-40d1-840a-1201959dee55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:37.821927 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.821826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/296b2963-2f2f-4c9f-9186-e593ae45021a-tmp\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98"
Apr 21 14:25:37.823252 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.823234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkvp5\" (UniqueName: \"kubernetes.io/projected/7144a992-05ff-4c23-81b5-3daad4c438a6-kube-api-access-tkvp5\") pod \"multus-hgpkp\" (UID: \"7144a992-05ff-4c23-81b5-3daad4c438a6\") " pod="openshift-multus/multus-hgpkp"
Apr 21 14:25:37.824733 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.824712 2576 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"kube-api-access-zfzfs\" (UniqueName: \"kubernetes.io/projected/296b2963-2f2f-4c9f-9186-e593ae45021a-kube-api-access-zfzfs\") pod \"tuned-fnk98\" (UID: \"296b2963-2f2f-4c9f-9186-e593ae45021a\") " pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:37.829537 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.829514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78qdl\" (UniqueName: \"kubernetes.io/projected/1a11169f-b65d-4081-aab8-2e91e9d09a48-kube-api-access-78qdl\") pod \"ovnkube-node-tn95v\" (UID: \"1a11169f-b65d-4081-aab8-2e91e9d09a48\") " pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:37.829966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.829950 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lswn8\" (UniqueName: \"kubernetes.io/projected/bbebdf2e-6e19-434f-9e3e-bb9880123092-kube-api-access-lswn8\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:25:37.919220 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbd47f5e-fbd3-42b6-9631-93eae13c275b-tmp-dir\") pod \"node-resolver-5bc2h\" (UID: \"fbd47f5e-fbd3-42b6-9631-93eae13c275b\") " pod="openshift-dns/node-resolver-5bc2h" Apr 21 14:25:37.919220 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919215 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a9148116-ff1f-4dd1-b372-36d40a8132b7-serviceca\") pod \"node-ca-nwm4s\" (UID: \"a9148116-ff1f-4dd1-b372-36d40a8132b7\") " pod="openshift-image-registry/node-ca-nwm4s" Apr 21 14:25:37.919220 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919232 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-sys-fs\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:37.919509 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-os-release\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.919509 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.919509 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919315 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdm69\" (UniqueName: \"kubernetes.io/projected/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-kube-api-access-hdm69\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.919509 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " 
pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.919509 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5524b629-aa78-45d5-95d5-b7a2b5f17b82-iptables-alerter-script\") pod \"iptables-alerter-v7x72\" (UID: \"5524b629-aa78-45d5-95d5-b7a2b5f17b82\") " pod="openshift-network-operator/iptables-alerter-v7x72" Apr 21 14:25:37.919509 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmsz\" (UniqueName: \"kubernetes.io/projected/fbd47f5e-fbd3-42b6-9631-93eae13c275b-kube-api-access-vhmsz\") pod \"node-resolver-5bc2h\" (UID: \"fbd47f5e-fbd3-42b6-9631-93eae13c275b\") " pod="openshift-dns/node-resolver-5bc2h" Apr 21 14:25:37.919509 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.919509 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-cnibin\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.919509 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-socket-dir\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sdzs\" (UniqueName: \"kubernetes.io/projected/5524b629-aa78-45d5-95d5-b7a2b5f17b82-kube-api-access-7sdzs\") pod \"iptables-alerter-v7x72\" (UID: \"5524b629-aa78-45d5-95d5-b7a2b5f17b82\") " pod="openshift-network-operator/iptables-alerter-v7x72" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hv296\" (UniqueName: \"kubernetes.io/projected/a9148116-ff1f-4dd1-b372-36d40a8132b7-kube-api-access-hv296\") pod \"node-ca-nwm4s\" (UID: \"a9148116-ff1f-4dd1-b372-36d40a8132b7\") " pod="openshift-image-registry/node-ca-nwm4s" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919567 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-device-dir\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919575 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-sys-fs\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919590 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-system-cni-dir\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9148116-ff1f-4dd1-b372-36d40a8132b7-host\") pod \"node-ca-nwm4s\" (UID: \"a9148116-ff1f-4dd1-b372-36d40a8132b7\") " pod="openshift-image-registry/node-ca-nwm4s" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919640 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-cni-binary-copy\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919652 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbd47f5e-fbd3-42b6-9631-93eae13c275b-tmp-dir\") pod \"node-resolver-5bc2h\" (UID: \"fbd47f5e-fbd3-42b6-9631-93eae13c275b\") " pod="openshift-dns/node-resolver-5bc2h" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5524b629-aa78-45d5-95d5-b7a2b5f17b82-host-slash\") pod \"iptables-alerter-v7x72\" (UID: \"5524b629-aa78-45d5-95d5-b7a2b5f17b82\") " pod="openshift-network-operator/iptables-alerter-v7x72" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 
14:25:37.919679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a9148116-ff1f-4dd1-b372-36d40a8132b7-serviceca\") pod \"node-ca-nwm4s\" (UID: \"a9148116-ff1f-4dd1-b372-36d40a8132b7\") " pod="openshift-image-registry/node-ca-nwm4s" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5524b629-aa78-45d5-95d5-b7a2b5f17b82-host-slash\") pod \"iptables-alerter-v7x72\" (UID: \"5524b629-aa78-45d5-95d5-b7a2b5f17b82\") " pod="openshift-network-operator/iptables-alerter-v7x72" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919732 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-registration-dir\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919784 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-socket-dir\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 
14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2drc4\" (UniqueName: \"kubernetes.io/projected/7f75f6e6-764d-495a-9847-751ba9625fa1-kube-api-access-2drc4\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-os-release\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.919945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-etc-selinux\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:37.920790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fbd47f5e-fbd3-42b6-9631-93eae13c275b-hosts-file\") pod \"node-resolver-5bc2h\" (UID: \"fbd47f5e-fbd3-42b6-9631-93eae13c275b\") " pod="openshift-dns/node-resolver-5bc2h" Apr 21 14:25:37.920790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919959 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: 
\"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:37.920790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.919986 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-registration-dir\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:37.920790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.920052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-device-dir\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:37.920790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.920092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-cnibin\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.920790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.920134 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.920790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.920148 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-system-cni-dir\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.920790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.920192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9148116-ff1f-4dd1-b372-36d40a8132b7-host\") pod \"node-ca-nwm4s\" (UID: \"a9148116-ff1f-4dd1-b372-36d40a8132b7\") " pod="openshift-image-registry/node-ca-nwm4s" Apr 21 14:25:37.920790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.920230 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7f75f6e6-764d-495a-9847-751ba9625fa1-etc-selinux\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:37.920790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.920248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fbd47f5e-fbd3-42b6-9631-93eae13c275b-hosts-file\") pod \"node-resolver-5bc2h\" (UID: \"fbd47f5e-fbd3-42b6-9631-93eae13c275b\") " pod="openshift-dns/node-resolver-5bc2h" Apr 21 14:25:37.920790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.920327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.920790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.920495 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.920790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.920564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5524b629-aa78-45d5-95d5-b7a2b5f17b82-iptables-alerter-script\") pod \"iptables-alerter-v7x72\" (UID: \"5524b629-aa78-45d5-95d5-b7a2b5f17b82\") " pod="openshift-network-operator/iptables-alerter-v7x72" Apr 21 14:25:37.920790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.920608 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-cni-binary-copy\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.931542 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.931492 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdm69\" (UniqueName: \"kubernetes.io/projected/8ad4357c-863f-4ad8-b101-d0a7bef5fa90-kube-api-access-hdm69\") pod \"multus-additional-cni-plugins-9hpmv\" (UID: \"8ad4357c-863f-4ad8-b101-d0a7bef5fa90\") " pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:37.931667 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.931553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv296\" (UniqueName: \"kubernetes.io/projected/a9148116-ff1f-4dd1-b372-36d40a8132b7-kube-api-access-hv296\") pod \"node-ca-nwm4s\" (UID: \"a9148116-ff1f-4dd1-b372-36d40a8132b7\") " pod="openshift-image-registry/node-ca-nwm4s" Apr 21 14:25:37.931741 ip-10-0-138-110 kubenswrapper[2576]: I0421 
14:25:37.931721 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmsz\" (UniqueName: \"kubernetes.io/projected/fbd47f5e-fbd3-42b6-9631-93eae13c275b-kube-api-access-vhmsz\") pod \"node-resolver-5bc2h\" (UID: \"fbd47f5e-fbd3-42b6-9631-93eae13c275b\") " pod="openshift-dns/node-resolver-5bc2h" Apr 21 14:25:37.931795 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.931736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sdzs\" (UniqueName: \"kubernetes.io/projected/5524b629-aa78-45d5-95d5-b7a2b5f17b82-kube-api-access-7sdzs\") pod \"iptables-alerter-v7x72\" (UID: \"5524b629-aa78-45d5-95d5-b7a2b5f17b82\") " pod="openshift-network-operator/iptables-alerter-v7x72" Apr 21 14:25:37.931937 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:37.931915 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2drc4\" (UniqueName: \"kubernetes.io/projected/7f75f6e6-764d-495a-9847-751ba9625fa1-kube-api-access-2drc4\") pod \"aws-ebs-csi-driver-node-fsw8f\" (UID: \"7f75f6e6-764d-495a-9847-751ba9625fa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:38.003906 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.003864 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:25:38.013337 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.013309 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fnk98" Apr 21 14:25:38.021039 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.021013 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hgpkp" Apr 21 14:25:38.026197 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.026178 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-4b46l" Apr 21 14:25:38.031719 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.031698 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nwm4s" Apr 21 14:25:38.038400 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.038191 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" Apr 21 14:25:38.044729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.044702 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5bc2h" Apr 21 14:25:38.051257 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.051235 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9hpmv" Apr 21 14:25:38.057809 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.057761 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-v7x72" Apr 21 14:25:38.094745 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:38.094709 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5524b629_aa78_45d5_95d5_b7a2b5f17b82.slice/crio-8e3d135ca75575d359dc0926a1ad92b088d5b89f91072bb271812adad2670ac2 WatchSource:0}: Error finding container 8e3d135ca75575d359dc0926a1ad92b088d5b89f91072bb271812adad2670ac2: Status 404 returned error can't find the container with id 8e3d135ca75575d359dc0926a1ad92b088d5b89f91072bb271812adad2670ac2 Apr 21 14:25:38.098200 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:38.098178 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a11169f_b65d_4081_aab8_2e91e9d09a48.slice/crio-eade7eb9e08595919037a61b900083d4aa603796f3e71c24bdd1806e9e5921ed WatchSource:0}: Error finding container eade7eb9e08595919037a61b900083d4aa603796f3e71c24bdd1806e9e5921ed: Status 404 returned error can't find the container with id eade7eb9e08595919037a61b900083d4aa603796f3e71c24bdd1806e9e5921ed Apr 21 14:25:38.100559 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:38.100534 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbd47f5e_fbd3_42b6_9631_93eae13c275b.slice/crio-32930fbb394e40997feca1b45557a2030db5e5cfc8442574e3e4ef61b5e4e8aa WatchSource:0}: Error finding container 32930fbb394e40997feca1b45557a2030db5e5cfc8442574e3e4ef61b5e4e8aa: Status 404 returned error can't find the container with id 32930fbb394e40997feca1b45557a2030db5e5cfc8442574e3e4ef61b5e4e8aa Apr 21 14:25:38.102011 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:38.101875 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77bed861_9016_43ee_825c_b20388894201.slice/crio-3ccde21469233111bdb239b1a519d59a46594c4d54d56bb9884e38ddfa2fb919 WatchSource:0}: Error finding container 3ccde21469233111bdb239b1a519d59a46594c4d54d56bb9884e38ddfa2fb919: Status 404 returned error can't find the container with id 3ccde21469233111bdb239b1a519d59a46594c4d54d56bb9884e38ddfa2fb919 Apr 21 14:25:38.102683 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:38.102660 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ad4357c_863f_4ad8_b101_d0a7bef5fa90.slice/crio-671ded842bd1628221a52307c8d5ee63dee7811bc0b1dfdfe4c2cca42590fe22 WatchSource:0}: Error finding container 671ded842bd1628221a52307c8d5ee63dee7811bc0b1dfdfe4c2cca42590fe22: Status 404 returned error can't find the container with id 671ded842bd1628221a52307c8d5ee63dee7811bc0b1dfdfe4c2cca42590fe22 Apr 21 14:25:38.104352 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:38.104327 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f75f6e6_764d_495a_9847_751ba9625fa1.slice/crio-88a8837c45d296d878c588b04d7da981de003f36668a22bc635e4b478ce93a02 WatchSource:0}: Error finding container 88a8837c45d296d878c588b04d7da981de003f36668a22bc635e4b478ce93a02: Status 404 returned error can't find the container with id 88a8837c45d296d878c588b04d7da981de003f36668a22bc635e4b478ce93a02 Apr 21 14:25:38.105105 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:38.104994 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7144a992_05ff_4c23_81b5_3daad4c438a6.slice/crio-4eccb780f7c2151f5fc06d2550e50651b587fb74d7322584be0518bca05ad3ed WatchSource:0}: Error finding container 4eccb780f7c2151f5fc06d2550e50651b587fb74d7322584be0518bca05ad3ed: Status 404 returned error can't find 
the container with id 4eccb780f7c2151f5fc06d2550e50651b587fb74d7322584be0518bca05ad3ed Apr 21 14:25:38.106758 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:38.106236 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod296b2963_2f2f_4c9f_9186_e593ae45021a.slice/crio-6b1e011b46699cc83a9f7146798496929321ac47a710404dc5fd50221b1c4b54 WatchSource:0}: Error finding container 6b1e011b46699cc83a9f7146798496929321ac47a710404dc5fd50221b1c4b54: Status 404 returned error can't find the container with id 6b1e011b46699cc83a9f7146798496929321ac47a710404dc5fd50221b1c4b54 Apr 21 14:25:38.106758 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:25:38.106600 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9148116_ff1f_4dd1_b372_36d40a8132b7.slice/crio-fe85891617a4fcf1b9e360919025a0063cf8396170460c4864e2b38a41c4edcb WatchSource:0}: Error finding container fe85891617a4fcf1b9e360919025a0063cf8396170460c4864e2b38a41c4edcb: Status 404 returned error can't find the container with id fe85891617a4fcf1b9e360919025a0063cf8396170460c4864e2b38a41c4edcb Apr 21 14:25:38.324087 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.323972 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fw8c\" (UniqueName: \"kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c\") pod \"network-check-target-64qpt\" (UID: \"d952ffe7-f15c-40d1-840a-1201959dee55\") " pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:25:38.324087 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.324015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " 
pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:38.324310 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:38.324166 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:38.324310 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:38.324237 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs podName:bbebdf2e-6e19-434f-9e3e-bb9880123092 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:39.324216123 +0000 UTC m=+3.990236744 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs") pod "network-metrics-daemon-5bc88" (UID: "bbebdf2e-6e19-434f-9e3e-bb9880123092") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:38.324310 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:38.324169 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 14:25:38.324310 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:38.324280 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 14:25:38.324310 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:38.324297 2576 projected.go:194] Error preparing data for projected volume kube-api-access-2fw8c for pod openshift-network-diagnostics/network-check-target-64qpt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:38.324583 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:38.324346 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c podName:d952ffe7-f15c-40d1-840a-1201959dee55 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:39.324333422 +0000 UTC m=+3.990354046 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2fw8c" (UniqueName: "kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c") pod "network-check-target-64qpt" (UID: "d952ffe7-f15c-40d1-840a-1201959dee55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:38.744665 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.744606 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 14:20:36 +0000 UTC" deadline="2028-01-27 05:56:19.104717 +0000 UTC"
Apr 21 14:25:38.744665 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.744662 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15495h30m40.360058779s"
Apr 21 14:25:38.800068 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.800033 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v7x72" event={"ID":"5524b629-aa78-45d5-95d5-b7a2b5f17b82","Type":"ContainerStarted","Data":"8e3d135ca75575d359dc0926a1ad92b088d5b89f91072bb271812adad2670ac2"}
Apr 21 14:25:38.802482 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.802453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fnk98" event={"ID":"296b2963-2f2f-4c9f-9186-e593ae45021a","Type":"ContainerStarted","Data":"6b1e011b46699cc83a9f7146798496929321ac47a710404dc5fd50221b1c4b54"}
Apr 21 14:25:38.803763 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.803724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hgpkp" event={"ID":"7144a992-05ff-4c23-81b5-3daad4c438a6","Type":"ContainerStarted","Data":"4eccb780f7c2151f5fc06d2550e50651b587fb74d7322584be0518bca05ad3ed"}
Apr 21 14:25:38.804895 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.804870 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" event={"ID":"7f75f6e6-764d-495a-9847-751ba9625fa1","Type":"ContainerStarted","Data":"88a8837c45d296d878c588b04d7da981de003f36668a22bc635e4b478ce93a02"}
Apr 21 14:25:38.806128 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.806089 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9hpmv" event={"ID":"8ad4357c-863f-4ad8-b101-d0a7bef5fa90","Type":"ContainerStarted","Data":"671ded842bd1628221a52307c8d5ee63dee7811bc0b1dfdfe4c2cca42590fe22"}
Apr 21 14:25:38.810138 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.810089 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" event={"ID":"1a11169f-b65d-4081-aab8-2e91e9d09a48","Type":"ContainerStarted","Data":"eade7eb9e08595919037a61b900083d4aa603796f3e71c24bdd1806e9e5921ed"}
Apr 21 14:25:38.812413 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.812392 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-110.ec2.internal" event={"ID":"1c841ec6859af6aa2f42f3eb0143c101","Type":"ContainerStarted","Data":"f1b637c69427edf74e081426c2b9c09b5fe4ea8b5c441a7911fe98c3be3bbbe0"}
Apr 21 14:25:38.815583 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.815559 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nwm4s" event={"ID":"a9148116-ff1f-4dd1-b372-36d40a8132b7","Type":"ContainerStarted","Data":"fe85891617a4fcf1b9e360919025a0063cf8396170460c4864e2b38a41c4edcb"}
Apr 21 14:25:38.819078 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.819053 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4b46l" event={"ID":"77bed861-9016-43ee-825c-b20388894201","Type":"ContainerStarted","Data":"3ccde21469233111bdb239b1a519d59a46594c4d54d56bb9884e38ddfa2fb919"}
Apr 21 14:25:38.820922 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:38.820900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5bc2h" event={"ID":"fbd47f5e-fbd3-42b6-9631-93eae13c275b","Type":"ContainerStarted","Data":"32930fbb394e40997feca1b45557a2030db5e5cfc8442574e3e4ef61b5e4e8aa"}
Apr 21 14:25:39.334225 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:39.333361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fw8c\" (UniqueName: \"kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c\") pod \"network-check-target-64qpt\" (UID: \"d952ffe7-f15c-40d1-840a-1201959dee55\") " pod="openshift-network-diagnostics/network-check-target-64qpt"
Apr 21 14:25:39.334225 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:39.333416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:39.334225 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:39.333560 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:39.334225 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:39.333622 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs podName:bbebdf2e-6e19-434f-9e3e-bb9880123092 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:41.333602459 +0000 UTC m=+5.999623101 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs") pod "network-metrics-daemon-5bc88" (UID: "bbebdf2e-6e19-434f-9e3e-bb9880123092") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:39.334225 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:39.333704 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 14:25:39.334225 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:39.333717 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 14:25:39.334225 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:39.333729 2576 projected.go:194] Error preparing data for projected volume kube-api-access-2fw8c for pod openshift-network-diagnostics/network-check-target-64qpt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:39.334225 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:39.333762 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c podName:d952ffe7-f15c-40d1-840a-1201959dee55 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:41.333751163 +0000 UTC m=+5.999771791 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2fw8c" (UniqueName: "kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c") pod "network-check-target-64qpt" (UID: "d952ffe7-f15c-40d1-840a-1201959dee55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:39.790857 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:39.790823 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt"
Apr 21 14:25:39.791326 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:39.790951 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55"
Apr 21 14:25:39.791326 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:39.791172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:39.791326 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:39.791267 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092"
Apr 21 14:25:40.843127 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:40.843078 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal" event={"ID":"077d02c6a0c7251870148fbde868cd7c","Type":"ContainerStarted","Data":"4f07ccc81283228d2cb2e21719952e7e08204584484705b3cd27328ab4caf3bb"}
Apr 21 14:25:40.861465 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:40.861356 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-110.ec2.internal" podStartSLOduration=3.861336802 podStartE2EDuration="3.861336802s" podCreationTimestamp="2026-04-21 14:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:25:38.828040369 +0000 UTC m=+3.494061012" watchObservedRunningTime="2026-04-21 14:25:40.861336802 +0000 UTC m=+5.527357446"
Apr 21 14:25:41.352178 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:41.351376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fw8c\" (UniqueName: \"kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c\") pod \"network-check-target-64qpt\" (UID: \"d952ffe7-f15c-40d1-840a-1201959dee55\") " pod="openshift-network-diagnostics/network-check-target-64qpt"
Apr 21 14:25:41.352178 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:41.351434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:41.352178 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:41.351573 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:41.352178 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:41.351634 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs podName:bbebdf2e-6e19-434f-9e3e-bb9880123092 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:45.351616636 +0000 UTC m=+10.017637262 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs") pod "network-metrics-daemon-5bc88" (UID: "bbebdf2e-6e19-434f-9e3e-bb9880123092") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:41.352178 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:41.352050 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 14:25:41.352178 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:41.352071 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 14:25:41.352178 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:41.352086 2576 projected.go:194] Error preparing data for projected volume kube-api-access-2fw8c for pod openshift-network-diagnostics/network-check-target-64qpt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:41.352178 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:41.352145 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c podName:d952ffe7-f15c-40d1-840a-1201959dee55 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:45.352129015 +0000 UTC m=+10.018149649 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2fw8c" (UniqueName: "kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c") pod "network-check-target-64qpt" (UID: "d952ffe7-f15c-40d1-840a-1201959dee55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:41.791870 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:41.791602 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:41.791870 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:41.791731 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092"
Apr 21 14:25:41.791870 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:41.791809 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt"
Apr 21 14:25:41.792193 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:41.791948 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55"
Apr 21 14:25:42.847430 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:42.847391 2576 generic.go:358] "Generic (PLEG): container finished" podID="077d02c6a0c7251870148fbde868cd7c" containerID="4f07ccc81283228d2cb2e21719952e7e08204584484705b3cd27328ab4caf3bb" exitCode=0
Apr 21 14:25:42.847891 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:42.847448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal" event={"ID":"077d02c6a0c7251870148fbde868cd7c","Type":"ContainerDied","Data":"4f07ccc81283228d2cb2e21719952e7e08204584484705b3cd27328ab4caf3bb"}
Apr 21 14:25:43.790888 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:43.790849 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt"
Apr 21 14:25:43.791058 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:43.790867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:43.791058 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:43.790977 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55"
Apr 21 14:25:43.791151 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:43.791065 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092"
Apr 21 14:25:45.391559 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:45.391521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fw8c\" (UniqueName: \"kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c\") pod \"network-check-target-64qpt\" (UID: \"d952ffe7-f15c-40d1-840a-1201959dee55\") " pod="openshift-network-diagnostics/network-check-target-64qpt"
Apr 21 14:25:45.391559 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:45.391576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:45.392072 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:45.391695 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:45.392072 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:45.391749 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs podName:bbebdf2e-6e19-434f-9e3e-bb9880123092 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:53.391734751 +0000 UTC m=+18.057755371 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs") pod "network-metrics-daemon-5bc88" (UID: "bbebdf2e-6e19-434f-9e3e-bb9880123092") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:45.392072 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:45.391696 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 14:25:45.392072 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:45.391794 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 14:25:45.392072 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:45.391807 2576 projected.go:194] Error preparing data for projected volume kube-api-access-2fw8c for pod openshift-network-diagnostics/network-check-target-64qpt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:45.392072 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:45.391856 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c podName:d952ffe7-f15c-40d1-840a-1201959dee55 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:53.391844686 +0000 UTC m=+18.057865318 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2fw8c" (UniqueName: "kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c") pod "network-check-target-64qpt" (UID: "d952ffe7-f15c-40d1-840a-1201959dee55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:45.791734 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:45.791654 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:45.791930 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:45.791776 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092"
Apr 21 14:25:45.792973 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:45.792949 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt"
Apr 21 14:25:45.793078 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:45.793037 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55"
Apr 21 14:25:46.858132 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:46.857874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal" event={"ID":"077d02c6a0c7251870148fbde868cd7c","Type":"ContainerStarted","Data":"f257e922b68390c625e41757492fa9b9757a00ad8ee7550bb8a68261c57e0e5c"}
Apr 21 14:25:46.859851 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:46.859817 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fnk98" event={"ID":"296b2963-2f2f-4c9f-9186-e593ae45021a","Type":"ContainerStarted","Data":"bfd97ba72d2cc8abd34965c66e72f76d15f395a8f47d67e63ba0383b3da91d52"}
Apr 21 14:25:46.894106 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:46.893763 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-fnk98" podStartSLOduration=3.400340228 podStartE2EDuration="11.893746005s" podCreationTimestamp="2026-04-21 14:25:35 +0000 UTC" firstStartedPulling="2026-04-21 14:25:38.128862614 +0000 UTC m=+2.794883249" lastFinishedPulling="2026-04-21 14:25:46.622268403 +0000 UTC m=+11.288289026" observedRunningTime="2026-04-21 14:25:46.893551898 +0000 UTC m=+11.559572552" watchObservedRunningTime="2026-04-21 14:25:46.893746005 +0000 UTC m=+11.559766627"
Apr 21 14:25:46.895373 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:46.894229 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-110.ec2.internal" podStartSLOduration=9.894220666 podStartE2EDuration="9.894220666s" podCreationTimestamp="2026-04-21 14:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:25:46.874390059 +0000 UTC m=+11.540410704" watchObservedRunningTime="2026-04-21 14:25:46.894220666 +0000 UTC m=+11.560241309"
Apr 21 14:25:47.791074 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:47.791036 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:47.791254 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:47.791036 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt"
Apr 21 14:25:47.791254 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:47.791208 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092"
Apr 21 14:25:47.791368 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:47.791345 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55"
Apr 21 14:25:47.863542 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:47.863472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" event={"ID":"7f75f6e6-764d-495a-9847-751ba9625fa1","Type":"ContainerStarted","Data":"c93e25c731a057cb7048482cc8380afa9cd7092c1eab92539324a14b623abe62"}
Apr 21 14:25:47.865602 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:47.865572 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9hpmv" event={"ID":"8ad4357c-863f-4ad8-b101-d0a7bef5fa90","Type":"ContainerStarted","Data":"8c72c178cf836c7ab4dce5bfdfd21fdb7c55cd7e46bc3bc477b1d7687a1ff82a"}
Apr 21 14:25:47.867627 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:47.867596 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nwm4s" event={"ID":"a9148116-ff1f-4dd1-b372-36d40a8132b7","Type":"ContainerStarted","Data":"cfe31904e600a37a02a61c35639cf2c481f95a5a1c3e3ee5ebb6ec87065c8d9b"}
Apr 21 14:25:47.869164 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:47.869135 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4b46l" event={"ID":"77bed861-9016-43ee-825c-b20388894201","Type":"ContainerStarted","Data":"7772de80ca4dd8d6b040249607cd7f98c43a5639be8f8e371bcb5c26b65c36d9"}
Apr 21 14:25:47.870782 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:47.870760 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5bc2h" event={"ID":"fbd47f5e-fbd3-42b6-9631-93eae13c275b","Type":"ContainerStarted","Data":"526e57c55873f294611d01d856118ad7e4069fde4c3d21a096542acd05e4f38d"}
Apr 21 14:25:47.906623 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:47.906567 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nwm4s" podStartSLOduration=3.48868386 podStartE2EDuration="11.906553292s" podCreationTimestamp="2026-04-21 14:25:36 +0000 UTC" firstStartedPulling="2026-04-21 14:25:38.128826937 +0000 UTC m=+2.794847565" lastFinishedPulling="2026-04-21 14:25:46.546696373 +0000 UTC m=+11.212716997" observedRunningTime="2026-04-21 14:25:47.906442395 +0000 UTC m=+12.572463040" watchObservedRunningTime="2026-04-21 14:25:47.906553292 +0000 UTC m=+12.572573934"
Apr 21 14:25:47.974700 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:47.974643 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4b46l" podStartSLOduration=4.53131587 podStartE2EDuration="12.974626456s" podCreationTimestamp="2026-04-21 14:25:35 +0000 UTC" firstStartedPulling="2026-04-21 14:25:38.103634255 +0000 UTC m=+2.769654875" lastFinishedPulling="2026-04-21 14:25:46.546944824 +0000 UTC m=+11.212965461" observedRunningTime="2026-04-21 14:25:47.974301243 +0000 UTC m=+12.640321885" watchObservedRunningTime="2026-04-21 14:25:47.974626456 +0000 UTC m=+12.640647099"
Apr 21 14:25:47.974867 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:47.974834 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5bc2h" podStartSLOduration=3.522239193 podStartE2EDuration="11.974827332s" podCreationTimestamp="2026-04-21 14:25:36 +0000 UTC" firstStartedPulling="2026-04-21 14:25:38.102325418 +0000 UTC m=+2.768346052" lastFinishedPulling="2026-04-21 14:25:46.554913366 +0000 UTC m=+11.220934191" observedRunningTime="2026-04-21 14:25:47.924509993 +0000 UTC m=+12.590530649" watchObservedRunningTime="2026-04-21 14:25:47.974827332 +0000 UTC m=+12.640847976"
Apr 21 14:25:48.873492 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:48.873410 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v7x72" event={"ID":"5524b629-aa78-45d5-95d5-b7a2b5f17b82","Type":"ContainerStarted","Data":"e1d9e04e94ff179ed87f7388e2cabd3b8a602b9e77c341fe16a81d91198b1fb8"}
Apr 21 14:25:48.889247 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:48.889194 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-v7x72" podStartSLOduration=4.439192768 podStartE2EDuration="12.88917488s" podCreationTimestamp="2026-04-21 14:25:36 +0000 UTC" firstStartedPulling="2026-04-21 14:25:38.09716324 +0000 UTC m=+2.763183860" lastFinishedPulling="2026-04-21 14:25:46.547145352 +0000 UTC m=+11.213165972" observedRunningTime="2026-04-21 14:25:48.888644382 +0000 UTC m=+13.554665025" watchObservedRunningTime="2026-04-21 14:25:48.88917488 +0000 UTC m=+13.555195526"
Apr 21 14:25:49.791079 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:49.791046 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:49.791079 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:49.791066 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt"
Apr 21 14:25:49.791326 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:49.791176 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55"
Apr 21 14:25:49.791394 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:49.791340 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092"
Apr 21 14:25:50.878840 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:50.878752 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ad4357c-863f-4ad8-b101-d0a7bef5fa90" containerID="8c72c178cf836c7ab4dce5bfdfd21fdb7c55cd7e46bc3bc477b1d7687a1ff82a" exitCode=0
Apr 21 14:25:50.878840 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:50.878813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9hpmv" event={"ID":"8ad4357c-863f-4ad8-b101-d0a7bef5fa90","Type":"ContainerDied","Data":"8c72c178cf836c7ab4dce5bfdfd21fdb7c55cd7e46bc3bc477b1d7687a1ff82a"}
Apr 21 14:25:51.710945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:51.710902 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4b46l"
Apr 21 14:25:51.711738 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:51.711708 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4b46l"
Apr 21 14:25:51.791427 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:51.791392 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt"
Apr 21 14:25:51.791606 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:51.791517 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55"
Apr 21 14:25:51.791606 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:51.791583 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:51.791766 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:51.791724 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092"
Apr 21 14:25:53.454123 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:53.453894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fw8c\" (UniqueName: \"kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c\") pod \"network-check-target-64qpt\" (UID: \"d952ffe7-f15c-40d1-840a-1201959dee55\") " pod="openshift-network-diagnostics/network-check-target-64qpt"
Apr 21 14:25:53.454590 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:53.454151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:53.454590 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:53.454054 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 14:25:53.454590 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:53.454224 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 14:25:53.454590 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:53.454242 2576 projected.go:194] Error preparing data for projected volume kube-api-access-2fw8c for pod openshift-network-diagnostics/network-check-target-64qpt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:53.454590 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:53.454292 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:53.454590 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:53.454307 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c podName:d952ffe7-f15c-40d1-840a-1201959dee55 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:09.454287117 +0000 UTC m=+34.120307750 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2fw8c" (UniqueName: "kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c") pod "network-check-target-64qpt" (UID: "d952ffe7-f15c-40d1-840a-1201959dee55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:53.454590 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:53.454343 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs podName:bbebdf2e-6e19-434f-9e3e-bb9880123092 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:09.454328093 +0000 UTC m=+34.120348713 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs") pod "network-metrics-daemon-5bc88" (UID: "bbebdf2e-6e19-434f-9e3e-bb9880123092") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:53.791144 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:53.790927 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt"
Apr 21 14:25:53.791144 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:53.790939 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:53.791144 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:53.791061 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55"
Apr 21 14:25:53.791488 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:53.791177 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092"
Apr 21 14:25:55.791328 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:55.791301 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88"
Apr 21 14:25:55.791745 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:55.791412 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092"
Apr 21 14:25:55.791745 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:55.791425 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:25:55.791745 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:55.791530 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55" Apr 21 14:25:55.888801 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:55.888768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hgpkp" event={"ID":"7144a992-05ff-4c23-81b5-3daad4c438a6","Type":"ContainerStarted","Data":"1d098118956f1630d05ca55c3f5612fd5d75de2c7589c05a55d27dd3163280db"} Apr 21 14:25:55.890601 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:55.890575 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" event={"ID":"1a11169f-b65d-4081-aab8-2e91e9d09a48","Type":"ContainerStarted","Data":"4aa48e5fe9c84a4f1de8e90ff19e0b970e2b3e069515343cd8f24f65ebb5d2dd"} Apr 21 14:25:55.907952 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:55.907893 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hgpkp" podStartSLOduration=3.3997354189999998 podStartE2EDuration="20.907875935s" podCreationTimestamp="2026-04-21 14:25:35 +0000 UTC" firstStartedPulling="2026-04-21 14:25:38.128671448 +0000 UTC m=+2.794692083" lastFinishedPulling="2026-04-21 14:25:55.636811974 +0000 UTC m=+20.302832599" observedRunningTime="2026-04-21 14:25:55.906752861 +0000 UTC m=+20.572773504" watchObservedRunningTime="2026-04-21 14:25:55.907875935 +0000 UTC m=+20.573896611" Apr 21 14:25:56.128825 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:56.128628 2576 plugin_watcher.go:194] "Adding 
socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 14:25:56.778678 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:56.778568 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T14:25:56.12882371Z","UUID":"8b0b6f62-f148-4262-86ba-a4c1c31395bf","Handler":null,"Name":"","Endpoint":""} Apr 21 14:25:56.781953 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:56.781909 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 14:25:56.781953 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:56.781943 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 14:25:56.894268 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:56.894232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" event={"ID":"7f75f6e6-764d-495a-9847-751ba9625fa1","Type":"ContainerStarted","Data":"b1425c7a9b2bee5f2dd71b292032bd2387a7f3d36193c16cdba705eb158023a3"} Apr 21 14:25:56.897882 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:56.897850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" event={"ID":"1a11169f-b65d-4081-aab8-2e91e9d09a48","Type":"ContainerStarted","Data":"315d7871bf17392fa4ca31e16785e660ddd01aec2bcabab21212bff4cd17ee5d"} Apr 21 14:25:56.898025 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:56.897889 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" 
event={"ID":"1a11169f-b65d-4081-aab8-2e91e9d09a48","Type":"ContainerStarted","Data":"3dbf4ef6d9edd7f578334e1e1b7bceee055d0035a4c0997b7a24735b3c9d12fc"} Apr 21 14:25:56.898025 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:56.897904 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" event={"ID":"1a11169f-b65d-4081-aab8-2e91e9d09a48","Type":"ContainerStarted","Data":"664906b0accd463b02973b26082ad19f85fffdf47443a4c3f533603c987d88c9"} Apr 21 14:25:56.898025 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:56.897916 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" event={"ID":"1a11169f-b65d-4081-aab8-2e91e9d09a48","Type":"ContainerStarted","Data":"8d36dd555ae9005e1d6e980855bb2beb472c2a5a59f773c44bf59297239f91f1"} Apr 21 14:25:56.898025 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:56.897929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" event={"ID":"1a11169f-b65d-4081-aab8-2e91e9d09a48","Type":"ContainerStarted","Data":"85e6d4b2d6e4374202898f260ca10805914aba265c28956575815cae85dc061a"} Apr 21 14:25:57.791703 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:57.791669 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:25:57.791893 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:57.791808 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092" Apr 21 14:25:57.791893 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:57.791868 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:25:57.792008 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:57.791980 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55" Apr 21 14:25:57.901786 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:57.901751 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" event={"ID":"7f75f6e6-764d-495a-9847-751ba9625fa1","Type":"ContainerStarted","Data":"2bcb49600c1be1ed493b39b77f1010a9d81454774e5341d8a6152a639c74b552"} Apr 21 14:25:57.919541 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:57.919493 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsw8f" podStartSLOduration=2.724240803 podStartE2EDuration="21.919479048s" podCreationTimestamp="2026-04-21 14:25:36 +0000 UTC" firstStartedPulling="2026-04-21 14:25:38.105998328 +0000 UTC m=+2.772018954" lastFinishedPulling="2026-04-21 14:25:57.301236563 +0000 UTC m=+21.967257199" observedRunningTime="2026-04-21 14:25:57.918936192 +0000 UTC m=+22.584956869" watchObservedRunningTime="2026-04-21 14:25:57.919479048 +0000 UTC m=+22.585499690" Apr 21 14:25:59.791678 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:59.791645 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:25:59.792345 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:59.791753 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092" Apr 21 14:25:59.792345 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:59.791858 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:25:59.792345 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:25:59.791979 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55" Apr 21 14:25:59.906781 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:59.906745 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ad4357c-863f-4ad8-b101-d0a7bef5fa90" containerID="0a06a57caf0914c2b079b7b273b9f21faf2266a0178e1e34d5eb80bdfe63102c" exitCode=0 Apr 21 14:25:59.906929 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:59.906813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9hpmv" event={"ID":"8ad4357c-863f-4ad8-b101-d0a7bef5fa90","Type":"ContainerDied","Data":"0a06a57caf0914c2b079b7b273b9f21faf2266a0178e1e34d5eb80bdfe63102c"} Apr 21 14:25:59.909797 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:25:59.909755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" event={"ID":"1a11169f-b65d-4081-aab8-2e91e9d09a48","Type":"ContainerStarted","Data":"c1b6a87d5ccff23cb6b4974778c95f6ece4387331cea5034560d0f9ae6ccd0c4"} Apr 21 14:26:01.791189 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:01.790990 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:26:01.791681 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:01.791021 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:26:01.791681 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:01.791264 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55" Apr 21 14:26:01.791681 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:01.791467 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092" Apr 21 14:26:01.915236 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:01.915204 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ad4357c-863f-4ad8-b101-d0a7bef5fa90" containerID="ef0fe99f65bcc300c6ecb68d112b4d9df5e2285876f7102f1bc1f90236e11d97" exitCode=0 Apr 21 14:26:01.915384 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:01.915273 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9hpmv" event={"ID":"8ad4357c-863f-4ad8-b101-d0a7bef5fa90","Type":"ContainerDied","Data":"ef0fe99f65bcc300c6ecb68d112b4d9df5e2285876f7102f1bc1f90236e11d97"} Apr 21 14:26:01.918490 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:01.918418 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" event={"ID":"1a11169f-b65d-4081-aab8-2e91e9d09a48","Type":"ContainerStarted","Data":"2f2113b276f5263eab82ed3376c0ccd859993f1c2647cb28709dbf01cc2d4bb5"} Apr 21 14:26:01.918728 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:01.918684 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:26:01.918790 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:01.918739 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:26:01.918790 
ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:01.918751 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:26:01.933055 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:01.933034 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:26:01.933189 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:01.933098 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:26:01.978980 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:01.978930 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" podStartSLOduration=9.442275711 podStartE2EDuration="26.978916321s" podCreationTimestamp="2026-04-21 14:25:35 +0000 UTC" firstStartedPulling="2026-04-21 14:25:38.100371844 +0000 UTC m=+2.766392464" lastFinishedPulling="2026-04-21 14:25:55.637012451 +0000 UTC m=+20.303033074" observedRunningTime="2026-04-21 14:26:01.9786815 +0000 UTC m=+26.644702143" watchObservedRunningTime="2026-04-21 14:26:01.978916321 +0000 UTC m=+26.644936962" Apr 21 14:26:03.791421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:03.791390 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:26:03.791775 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:03.791518 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55" Apr 21 14:26:03.791775 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:03.791572 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:26:03.791775 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:03.791715 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092" Apr 21 14:26:03.923974 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:03.923942 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ad4357c-863f-4ad8-b101-d0a7bef5fa90" containerID="a2c8b537e0b7e6984a2ef340c2d363b8f12daaaa5bb614752b868c52f7241344" exitCode=0 Apr 21 14:26:03.924243 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:03.924017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9hpmv" event={"ID":"8ad4357c-863f-4ad8-b101-d0a7bef5fa90","Type":"ContainerDied","Data":"a2c8b537e0b7e6984a2ef340c2d363b8f12daaaa5bb614752b868c52f7241344"} Apr 21 14:26:04.218803 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:04.218767 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-64qpt"] Apr 21 14:26:04.219126 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:04.218883 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:26:04.219126 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:04.218990 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55" Apr 21 14:26:04.219273 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:04.219129 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5bc88"] Apr 21 14:26:04.219273 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:04.219209 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:26:04.219377 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:04.219303 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092" Apr 21 14:26:04.544325 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:04.544269 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4b46l" Apr 21 14:26:04.544510 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:04.544465 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 14:26:04.545023 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:04.545000 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4b46l" Apr 21 14:26:05.792100 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:05.792066 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:26:05.792605 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:05.792197 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55" Apr 21 14:26:05.792605 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:05.792243 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:26:05.792605 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:05.792358 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092" Apr 21 14:26:07.723264 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:07.723080 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-s2hcn"] Apr 21 14:26:07.750761 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:07.750729 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s2hcn"] Apr 21 14:26:07.750933 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:07.750857 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:07.750997 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:07.750924 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s2hcn" podUID="14f6c67f-3406-49c7-bc12-9e14ef76f972" Apr 21 14:26:07.790702 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:07.790659 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:26:07.790880 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:07.790801 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092" Apr 21 14:26:07.790880 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:07.790838 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:26:07.791006 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:07.790905 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64qpt" podUID="d952ffe7-f15c-40d1-840a-1201959dee55" Apr 21 14:26:07.864171 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:07.864140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/14f6c67f-3406-49c7-bc12-9e14ef76f972-kubelet-config\") pod \"global-pull-secret-syncer-s2hcn\" (UID: \"14f6c67f-3406-49c7-bc12-9e14ef76f972\") " pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:07.864171 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:07.864185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/14f6c67f-3406-49c7-bc12-9e14ef76f972-original-pull-secret\") pod \"global-pull-secret-syncer-s2hcn\" (UID: \"14f6c67f-3406-49c7-bc12-9e14ef76f972\") " pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:07.864410 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:07.864308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/14f6c67f-3406-49c7-bc12-9e14ef76f972-dbus\") pod \"global-pull-secret-syncer-s2hcn\" (UID: \"14f6c67f-3406-49c7-bc12-9e14ef76f972\") " pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:07.964943 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:07.964905 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/14f6c67f-3406-49c7-bc12-9e14ef76f972-kubelet-config\") pod \"global-pull-secret-syncer-s2hcn\" (UID: \"14f6c67f-3406-49c7-bc12-9e14ef76f972\") " pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:07.965137 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:07.964964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/14f6c67f-3406-49c7-bc12-9e14ef76f972-original-pull-secret\") pod \"global-pull-secret-syncer-s2hcn\" (UID: \"14f6c67f-3406-49c7-bc12-9e14ef76f972\") " pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:07.965137 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:07.965042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/14f6c67f-3406-49c7-bc12-9e14ef76f972-dbus\") pod \"global-pull-secret-syncer-s2hcn\" (UID: \"14f6c67f-3406-49c7-bc12-9e14ef76f972\") " pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:07.965137 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:07.965047 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/14f6c67f-3406-49c7-bc12-9e14ef76f972-kubelet-config\") pod \"global-pull-secret-syncer-s2hcn\" (UID: \"14f6c67f-3406-49c7-bc12-9e14ef76f972\") " pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:07.965137 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:07.965062 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:26:07.965137 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:07.965141 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f6c67f-3406-49c7-bc12-9e14ef76f972-original-pull-secret 
podName:14f6c67f-3406-49c7-bc12-9e14ef76f972 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:08.465125651 +0000 UTC m=+33.131146271 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/14f6c67f-3406-49c7-bc12-9e14ef76f972-original-pull-secret") pod "global-pull-secret-syncer-s2hcn" (UID: "14f6c67f-3406-49c7-bc12-9e14ef76f972") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:26:07.965377 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:07.965325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/14f6c67f-3406-49c7-bc12-9e14ef76f972-dbus\") pod \"global-pull-secret-syncer-s2hcn\" (UID: \"14f6c67f-3406-49c7-bc12-9e14ef76f972\") " pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:08.155692 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.155664 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-110.ec2.internal" event="NodeReady" Apr 21 14:26:08.155852 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.155788 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 14:26:08.206970 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.206933 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-x44hx"] Apr 21 14:26:08.224072 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.224034 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-f5mvw"] Apr 21 14:26:08.224255 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.224224 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:08.228139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.227442 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 14:26:08.228139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.227818 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 14:26:08.229426 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.229337 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-88srx\"" Apr 21 14:26:08.235958 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.235934 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x44hx"] Apr 21 14:26:08.236080 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.235966 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-f5mvw"] Apr 21 14:26:08.236080 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.236069 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:26:08.238309 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.238288 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pqf56\"" Apr 21 14:26:08.238392 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.238330 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 14:26:08.238812 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.238793 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 14:26:08.239054 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.239034 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 14:26:08.367701 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.367615 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert\") pod \"ingress-canary-f5mvw\" (UID: \"000af909-ad67-432e-8f90-beaf88b66978\") " pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:26:08.367701 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.367694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91d1fedb-0634-4e6a-97a4-81d44c528d9f-config-volume\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:08.367938 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.367790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/91d1fedb-0634-4e6a-97a4-81d44c528d9f-tmp-dir\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:08.367938 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.367839 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb6bk\" (UniqueName: \"kubernetes.io/projected/91d1fedb-0634-4e6a-97a4-81d44c528d9f-kube-api-access-pb6bk\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:08.367938 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.367860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnxh7\" (UniqueName: \"kubernetes.io/projected/000af909-ad67-432e-8f90-beaf88b66978-kube-api-access-fnxh7\") pod \"ingress-canary-f5mvw\" (UID: \"000af909-ad67-432e-8f90-beaf88b66978\") " pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:26:08.367938 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.367878 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:08.468486 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.468445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/14f6c67f-3406-49c7-bc12-9e14ef76f972-original-pull-secret\") pod \"global-pull-secret-syncer-s2hcn\" (UID: \"14f6c67f-3406-49c7-bc12-9e14ef76f972\") " pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:08.468650 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.468498 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91d1fedb-0634-4e6a-97a4-81d44c528d9f-config-volume\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:08.468650 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:08.468560 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:26:08.468650 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.468568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/91d1fedb-0634-4e6a-97a4-81d44c528d9f-tmp-dir\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:08.468650 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.468596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb6bk\" (UniqueName: \"kubernetes.io/projected/91d1fedb-0634-4e6a-97a4-81d44c528d9f-kube-api-access-pb6bk\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:08.468650 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:08.468625 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f6c67f-3406-49c7-bc12-9e14ef76f972-original-pull-secret podName:14f6c67f-3406-49c7-bc12-9e14ef76f972 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:09.468608172 +0000 UTC m=+34.134628797 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/14f6c67f-3406-49c7-bc12-9e14ef76f972-original-pull-secret") pod "global-pull-secret-syncer-s2hcn" (UID: "14f6c67f-3406-49c7-bc12-9e14ef76f972") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:26:08.468889 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.468661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnxh7\" (UniqueName: \"kubernetes.io/projected/000af909-ad67-432e-8f90-beaf88b66978-kube-api-access-fnxh7\") pod \"ingress-canary-f5mvw\" (UID: \"000af909-ad67-432e-8f90-beaf88b66978\") " pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:26:08.468889 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.468689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:08.468889 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.468719 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert\") pod \"ingress-canary-f5mvw\" (UID: \"000af909-ad67-432e-8f90-beaf88b66978\") " pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:26:08.468889 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:08.468801 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:26:08.468889 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:08.468835 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert podName:000af909-ad67-432e-8f90-beaf88b66978 nodeName:}" failed. 
No retries permitted until 2026-04-21 14:26:08.968823658 +0000 UTC m=+33.634844282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert") pod "ingress-canary-f5mvw" (UID: "000af909-ad67-432e-8f90-beaf88b66978") : secret "canary-serving-cert" not found Apr 21 14:26:08.469069 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:08.468968 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:26:08.469069 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:08.469024 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls podName:91d1fedb-0634-4e6a-97a4-81d44c528d9f nodeName:}" failed. No retries permitted until 2026-04-21 14:26:08.969008805 +0000 UTC m=+33.635029431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls") pod "dns-default-x44hx" (UID: "91d1fedb-0634-4e6a-97a4-81d44c528d9f") : secret "dns-default-metrics-tls" not found Apr 21 14:26:08.469214 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.469177 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/91d1fedb-0634-4e6a-97a4-81d44c528d9f-tmp-dir\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:08.479650 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.479624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb6bk\" (UniqueName: \"kubernetes.io/projected/91d1fedb-0634-4e6a-97a4-81d44c528d9f-kube-api-access-pb6bk\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:08.479775 
ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.479724 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnxh7\" (UniqueName: \"kubernetes.io/projected/000af909-ad67-432e-8f90-beaf88b66978-kube-api-access-fnxh7\") pod \"ingress-canary-f5mvw\" (UID: \"000af909-ad67-432e-8f90-beaf88b66978\") " pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:26:08.480143 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.480101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91d1fedb-0634-4e6a-97a4-81d44c528d9f-config-volume\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:08.971966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.971926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:08.971966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:08.971973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert\") pod \"ingress-canary-f5mvw\" (UID: \"000af909-ad67-432e-8f90-beaf88b66978\") " pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:26:08.972661 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:08.972065 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:26:08.972661 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:08.972104 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:26:08.972661 ip-10-0-138-110 
kubenswrapper[2576]: E0421 14:26:08.972137 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert podName:000af909-ad67-432e-8f90-beaf88b66978 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:09.972104126 +0000 UTC m=+34.638124768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert") pod "ingress-canary-f5mvw" (UID: "000af909-ad67-432e-8f90-beaf88b66978") : secret "canary-serving-cert" not found Apr 21 14:26:08.972661 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:08.972215 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls podName:91d1fedb-0634-4e6a-97a4-81d44c528d9f nodeName:}" failed. No retries permitted until 2026-04-21 14:26:09.97219468 +0000 UTC m=+34.638215305 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls") pod "dns-default-x44hx" (UID: "91d1fedb-0634-4e6a-97a4-81d44c528d9f") : secret "dns-default-metrics-tls" not found Apr 21 14:26:09.476965 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:09.476927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/14f6c67f-3406-49c7-bc12-9e14ef76f972-original-pull-secret\") pod \"global-pull-secret-syncer-s2hcn\" (UID: \"14f6c67f-3406-49c7-bc12-9e14ef76f972\") " pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:09.477187 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:09.476986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fw8c\" (UniqueName: \"kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c\") pod \"network-check-target-64qpt\" (UID: 
\"d952ffe7-f15c-40d1-840a-1201959dee55\") " pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:26:09.477187 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:09.477020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:26:09.477187 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:09.477104 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:26:09.477187 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:09.477160 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:26:09.477373 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:09.477191 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:26:09.477373 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:09.477212 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f6c67f-3406-49c7-bc12-9e14ef76f972-original-pull-secret podName:14f6c67f-3406-49c7-bc12-9e14ef76f972 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:11.477194277 +0000 UTC m=+36.143214918 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/14f6c67f-3406-49c7-bc12-9e14ef76f972-original-pull-secret") pod "global-pull-secret-syncer-s2hcn" (UID: "14f6c67f-3406-49c7-bc12-9e14ef76f972") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:26:09.477373 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:09.477215 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:26:09.477373 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:09.477231 2576 projected.go:194] Error preparing data for projected volume kube-api-access-2fw8c for pod openshift-network-diagnostics/network-check-target-64qpt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:26:09.477373 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:09.477231 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs podName:bbebdf2e-6e19-434f-9e3e-bb9880123092 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:41.477222961 +0000 UTC m=+66.143243581 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs") pod "network-metrics-daemon-5bc88" (UID: "bbebdf2e-6e19-434f-9e3e-bb9880123092") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:26:09.477373 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:09.477292 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c podName:d952ffe7-f15c-40d1-840a-1201959dee55 nodeName:}" failed. 
No retries permitted until 2026-04-21 14:26:41.477274793 +0000 UTC m=+66.143295418 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-2fw8c" (UniqueName: "kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c") pod "network-check-target-64qpt" (UID: "d952ffe7-f15c-40d1-840a-1201959dee55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:26:09.791161 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:09.791071 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:26:09.791324 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:09.791071 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:09.791324 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:09.791071 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:26:09.794202 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:09.794180 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 14:26:09.794202 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:09.794198 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fd928\"" Apr 21 14:26:09.795097 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:09.795077 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 14:26:09.795210 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:09.795077 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 14:26:09.795210 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:09.795156 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9wlmf\"" Apr 21 14:26:09.795210 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:09.795167 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 14:26:09.981139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:09.980970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:09.981139 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:09.981019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert\") pod \"ingress-canary-f5mvw\" (UID: \"000af909-ad67-432e-8f90-beaf88b66978\") " pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:26:09.981748 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:09.981736 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:26:09.981804 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:09.981755 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:26:09.981855 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:09.981811 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls podName:91d1fedb-0634-4e6a-97a4-81d44c528d9f nodeName:}" failed. No retries permitted until 2026-04-21 14:26:11.981790225 +0000 UTC m=+36.647810845 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls") pod "dns-default-x44hx" (UID: "91d1fedb-0634-4e6a-97a4-81d44c528d9f") : secret "dns-default-metrics-tls" not found Apr 21 14:26:09.981855 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:09.981833 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert podName:000af909-ad67-432e-8f90-beaf88b66978 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:11.981823781 +0000 UTC m=+36.647844402 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert") pod "ingress-canary-f5mvw" (UID: "000af909-ad67-432e-8f90-beaf88b66978") : secret "canary-serving-cert" not found Apr 21 14:26:10.939402 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:10.939370 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ad4357c-863f-4ad8-b101-d0a7bef5fa90" containerID="ff4781f7c01b876026f6bef3ae645e07b2a85f4999412d2582b3c64d2db58f37" exitCode=0 Apr 21 14:26:10.939640 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:10.939416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9hpmv" event={"ID":"8ad4357c-863f-4ad8-b101-d0a7bef5fa90","Type":"ContainerDied","Data":"ff4781f7c01b876026f6bef3ae645e07b2a85f4999412d2582b3c64d2db58f37"} Apr 21 14:26:11.494654 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:11.494610 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/14f6c67f-3406-49c7-bc12-9e14ef76f972-original-pull-secret\") pod \"global-pull-secret-syncer-s2hcn\" (UID: \"14f6c67f-3406-49c7-bc12-9e14ef76f972\") " pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:11.497036 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:11.497002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/14f6c67f-3406-49c7-bc12-9e14ef76f972-original-pull-secret\") pod \"global-pull-secret-syncer-s2hcn\" (UID: \"14f6c67f-3406-49c7-bc12-9e14ef76f972\") " pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:11.606781 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:11.606743 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-s2hcn" Apr 21 14:26:11.777427 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:11.777200 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s2hcn"] Apr 21 14:26:11.780665 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:26:11.780640 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14f6c67f_3406_49c7_bc12_9e14ef76f972.slice/crio-fea6165496ee07165ff2232bc6a65aa704dd947c0a7baf3f2c77435b4418ecb6 WatchSource:0}: Error finding container fea6165496ee07165ff2232bc6a65aa704dd947c0a7baf3f2c77435b4418ecb6: Status 404 returned error can't find the container with id fea6165496ee07165ff2232bc6a65aa704dd947c0a7baf3f2c77435b4418ecb6 Apr 21 14:26:11.944332 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:11.944301 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ad4357c-863f-4ad8-b101-d0a7bef5fa90" containerID="3eb5afd7de943cd0219e6df1fe258bd5bd6f0c73ac88bd3022e02ab1e043d8aa" exitCode=0 Apr 21 14:26:11.944472 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:11.944372 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9hpmv" event={"ID":"8ad4357c-863f-4ad8-b101-d0a7bef5fa90","Type":"ContainerDied","Data":"3eb5afd7de943cd0219e6df1fe258bd5bd6f0c73ac88bd3022e02ab1e043d8aa"} Apr 21 14:26:11.945638 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:11.945322 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s2hcn" event={"ID":"14f6c67f-3406-49c7-bc12-9e14ef76f972","Type":"ContainerStarted","Data":"fea6165496ee07165ff2232bc6a65aa704dd947c0a7baf3f2c77435b4418ecb6"} Apr 21 14:26:11.998384 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:11.998357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:11.998535 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:11.998391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert\") pod \"ingress-canary-f5mvw\" (UID: \"000af909-ad67-432e-8f90-beaf88b66978\") " pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:26:11.998535 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:11.998514 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:26:11.998653 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:11.998559 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:26:11.998653 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:11.998589 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls podName:91d1fedb-0634-4e6a-97a4-81d44c528d9f nodeName:}" failed. No retries permitted until 2026-04-21 14:26:15.998565965 +0000 UTC m=+40.664586587 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls") pod "dns-default-x44hx" (UID: "91d1fedb-0634-4e6a-97a4-81d44c528d9f") : secret "dns-default-metrics-tls" not found Apr 21 14:26:11.998653 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:11.998611 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert podName:000af909-ad67-432e-8f90-beaf88b66978 nodeName:}" failed. 
No retries permitted until 2026-04-21 14:26:15.998597186 +0000 UTC m=+40.664617806 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert") pod "ingress-canary-f5mvw" (UID: "000af909-ad67-432e-8f90-beaf88b66978") : secret "canary-serving-cert" not found Apr 21 14:26:12.951378 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:12.951341 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9hpmv" event={"ID":"8ad4357c-863f-4ad8-b101-d0a7bef5fa90","Type":"ContainerStarted","Data":"d8f07195e5583dcad71b5b62321bf0ab1f003697024f6bac8795d388e1bfd053"} Apr 21 14:26:12.975400 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:12.975356 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9hpmv" podStartSLOduration=5.151599665 podStartE2EDuration="36.975340922s" podCreationTimestamp="2026-04-21 14:25:36 +0000 UTC" firstStartedPulling="2026-04-21 14:25:38.104513439 +0000 UTC m=+2.770534062" lastFinishedPulling="2026-04-21 14:26:09.928254681 +0000 UTC m=+34.594275319" observedRunningTime="2026-04-21 14:26:12.973948901 +0000 UTC m=+37.639969542" watchObservedRunningTime="2026-04-21 14:26:12.975340922 +0000 UTC m=+37.641361563" Apr 21 14:26:15.958266 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:15.958221 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s2hcn" event={"ID":"14f6c67f-3406-49c7-bc12-9e14ef76f972","Type":"ContainerStarted","Data":"0a3b45e3e8b30e0b4a14b8281976229b490e2f1f74e7508dde30af0091fa1ad8"} Apr 21 14:26:15.974003 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:15.973958 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-s2hcn" podStartSLOduration=5.459570327 podStartE2EDuration="8.973944536s" podCreationTimestamp="2026-04-21 
14:26:07 +0000 UTC" firstStartedPulling="2026-04-21 14:26:11.782363807 +0000 UTC m=+36.448384428" lastFinishedPulling="2026-04-21 14:26:15.296738011 +0000 UTC m=+39.962758637" observedRunningTime="2026-04-21 14:26:15.973309965 +0000 UTC m=+40.639330617" watchObservedRunningTime="2026-04-21 14:26:15.973944536 +0000 UTC m=+40.639965175" Apr 21 14:26:16.027632 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.027599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert\") pod \"ingress-canary-f5mvw\" (UID: \"000af909-ad67-432e-8f90-beaf88b66978\") " pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:26:16.027744 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.027696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:16.027794 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:16.027741 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:26:16.027794 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:16.027780 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:26:16.027880 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:16.027803 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert podName:000af909-ad67-432e-8f90-beaf88b66978 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:24.027788353 +0000 UTC m=+48.693808973 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert") pod "ingress-canary-f5mvw" (UID: "000af909-ad67-432e-8f90-beaf88b66978") : secret "canary-serving-cert" not found Apr 21 14:26:16.027880 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:16.027831 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls podName:91d1fedb-0634-4e6a-97a4-81d44c528d9f nodeName:}" failed. No retries permitted until 2026-04-21 14:26:24.027817601 +0000 UTC m=+48.693838220 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls") pod "dns-default-x44hx" (UID: "91d1fedb-0634-4e6a-97a4-81d44c528d9f") : secret "dns-default-metrics-tls" not found Apr 21 14:26:16.095266 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.095234 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc"] Apr 21 14:26:16.096931 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.096916 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc" Apr 21 14:26:16.099455 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.099426 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 14:26:16.099455 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.099445 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 14:26:16.099632 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.099428 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 14:26:16.099632 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.099453 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 21 14:26:16.099632 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.099462 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-wkqth\"" Apr 21 14:26:16.108326 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.108302 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc"] Apr 21 14:26:16.128958 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.128938 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfzdb\" (UniqueName: \"kubernetes.io/projected/48ee9e5b-b035-4559-9477-c5c9b335b538-kube-api-access-tfzdb\") pod \"managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc\" (UID: \"48ee9e5b-b035-4559-9477-c5c9b335b538\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc" Apr 21 14:26:16.129078 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.128998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/48ee9e5b-b035-4559-9477-c5c9b335b538-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc\" (UID: \"48ee9e5b-b035-4559-9477-c5c9b335b538\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc" Apr 21 14:26:16.137054 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.137034 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg"] Apr 21 14:26:16.138709 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.138693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.141149 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.141107 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 21 14:26:16.141249 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.141185 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 21 14:26:16.141249 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.141189 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 21 14:26:16.141249 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.141192 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 21 14:26:16.152156 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.152135 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg"] Apr 21 14:26:16.230100 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.229998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0984855a-0ffe-408b-ba32-08b6b8ead44e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.230100 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.230045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w4jk\" (UniqueName: \"kubernetes.io/projected/0984855a-0ffe-408b-ba32-08b6b8ead44e-kube-api-access-4w4jk\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.230359 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.230106 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/48ee9e5b-b035-4559-9477-c5c9b335b538-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc\" (UID: \"48ee9e5b-b035-4559-9477-c5c9b335b538\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc" Apr 21 14:26:16.230359 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.230170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/0984855a-0ffe-408b-ba32-08b6b8ead44e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.230359 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.230198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0984855a-0ffe-408b-ba32-08b6b8ead44e-ca\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.230359 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.230273 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0984855a-0ffe-408b-ba32-08b6b8ead44e-hub\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.230359 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.230298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0984855a-0ffe-408b-ba32-08b6b8ead44e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.230359 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.230344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfzdb\" (UniqueName: \"kubernetes.io/projected/48ee9e5b-b035-4559-9477-c5c9b335b538-kube-api-access-tfzdb\") pod 
\"managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc\" (UID: \"48ee9e5b-b035-4559-9477-c5c9b335b538\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc" Apr 21 14:26:16.234433 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.234411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/48ee9e5b-b035-4559-9477-c5c9b335b538-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc\" (UID: \"48ee9e5b-b035-4559-9477-c5c9b335b538\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc" Apr 21 14:26:16.238678 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.238645 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfzdb\" (UniqueName: \"kubernetes.io/projected/48ee9e5b-b035-4559-9477-c5c9b335b538-kube-api-access-tfzdb\") pod \"managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc\" (UID: \"48ee9e5b-b035-4559-9477-c5c9b335b538\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc" Apr 21 14:26:16.331822 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.331765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0984855a-0ffe-408b-ba32-08b6b8ead44e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.332013 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.331844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4w4jk\" (UniqueName: \"kubernetes.io/projected/0984855a-0ffe-408b-ba32-08b6b8ead44e-kube-api-access-4w4jk\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.332729 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.332700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0984855a-0ffe-408b-ba32-08b6b8ead44e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.333155 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.333130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0984855a-0ffe-408b-ba32-08b6b8ead44e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.333241 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.333196 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0984855a-0ffe-408b-ba32-08b6b8ead44e-ca\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.333294 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.333286 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0984855a-0ffe-408b-ba32-08b6b8ead44e-hub\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.333343 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.333319 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0984855a-0ffe-408b-ba32-08b6b8ead44e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.336098 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.336070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0984855a-0ffe-408b-ba32-08b6b8ead44e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.336220 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.336163 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0984855a-0ffe-408b-ba32-08b6b8ead44e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.337028 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.337011 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0984855a-0ffe-408b-ba32-08b6b8ead44e-ca\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.337155 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.337138 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0984855a-0ffe-408b-ba32-08b6b8ead44e-hub\") pod 
\"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.341580 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.341561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w4jk\" (UniqueName: \"kubernetes.io/projected/0984855a-0ffe-408b-ba32-08b6b8ead44e-kube-api-access-4w4jk\") pod \"cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg\" (UID: \"0984855a-0ffe-408b-ba32-08b6b8ead44e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.416746 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.416700 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc" Apr 21 14:26:16.446349 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.446310 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:26:16.552820 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.552786 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc"] Apr 21 14:26:16.558583 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:26:16.558556 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48ee9e5b_b035_4559_9477_c5c9b335b538.slice/crio-3c1684b5e2ae73a2f965b99e33078cdb8ef1232ac3e2e3325fdc9dcdc837a2bb WatchSource:0}: Error finding container 3c1684b5e2ae73a2f965b99e33078cdb8ef1232ac3e2e3325fdc9dcdc837a2bb: Status 404 returned error can't find the container with id 3c1684b5e2ae73a2f965b99e33078cdb8ef1232ac3e2e3325fdc9dcdc837a2bb Apr 21 14:26:16.583560 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.583529 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg"] Apr 21 14:26:16.586760 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:26:16.586734 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0984855a_0ffe_408b_ba32_08b6b8ead44e.slice/crio-6e7dd65b2c15f8c6e8d39939af126b4105d52153f8f53429a020abd86f9b0289 WatchSource:0}: Error finding container 6e7dd65b2c15f8c6e8d39939af126b4105d52153f8f53429a020abd86f9b0289: Status 404 returned error can't find the container with id 6e7dd65b2c15f8c6e8d39939af126b4105d52153f8f53429a020abd86f9b0289 Apr 21 14:26:16.961408 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.961375 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" 
event={"ID":"0984855a-0ffe-408b-ba32-08b6b8ead44e","Type":"ContainerStarted","Data":"6e7dd65b2c15f8c6e8d39939af126b4105d52153f8f53429a020abd86f9b0289"} Apr 21 14:26:16.962427 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:16.962399 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc" event={"ID":"48ee9e5b-b035-4559-9477-c5c9b335b538","Type":"ContainerStarted","Data":"3c1684b5e2ae73a2f965b99e33078cdb8ef1232ac3e2e3325fdc9dcdc837a2bb"} Apr 21 14:26:19.970582 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:19.970311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" event={"ID":"0984855a-0ffe-408b-ba32-08b6b8ead44e","Type":"ContainerStarted","Data":"32367d3fe86ca849deeaedd5e5481cdcb33df20ee6c2bf8ab842ee1b4f160f7b"} Apr 21 14:26:19.971430 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:19.971408 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc" event={"ID":"48ee9e5b-b035-4559-9477-c5c9b335b538","Type":"ContainerStarted","Data":"12f5212fdf75b9091179f05c3346774fed4c378c4a507421c5675a076fbe1455"} Apr 21 14:26:19.986597 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:19.986555 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc" podStartSLOduration=0.825842627 podStartE2EDuration="3.98654234s" podCreationTimestamp="2026-04-21 14:26:16 +0000 UTC" firstStartedPulling="2026-04-21 14:26:16.560377873 +0000 UTC m=+41.226398496" lastFinishedPulling="2026-04-21 14:26:19.721077571 +0000 UTC m=+44.387098209" observedRunningTime="2026-04-21 14:26:19.986023018 +0000 UTC m=+44.652043657" watchObservedRunningTime="2026-04-21 14:26:19.98654234 +0000 UTC m=+44.652562981" Apr 21 14:26:21.977541 
ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:21.977504 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" event={"ID":"0984855a-0ffe-408b-ba32-08b6b8ead44e","Type":"ContainerStarted","Data":"ca9684902777024f8b7a4feb0c9b5611c5ab7055cc775fcd59430c274a52cbce"} Apr 21 14:26:21.977541 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:21.977545 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" event={"ID":"0984855a-0ffe-408b-ba32-08b6b8ead44e","Type":"ContainerStarted","Data":"82dfc54bcfc02a6a972cadbad42cbbf7c6ad24f920d38da748e9e1e1686886db"} Apr 21 14:26:21.999360 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:21.999274 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" podStartSLOduration=0.839493126 podStartE2EDuration="5.999259563s" podCreationTimestamp="2026-04-21 14:26:16 +0000 UTC" firstStartedPulling="2026-04-21 14:26:16.588439842 +0000 UTC m=+41.254460473" lastFinishedPulling="2026-04-21 14:26:21.748206276 +0000 UTC m=+46.414226910" observedRunningTime="2026-04-21 14:26:21.998671574 +0000 UTC m=+46.664692216" watchObservedRunningTime="2026-04-21 14:26:21.999259563 +0000 UTC m=+46.665280206" Apr 21 14:26:24.094188 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:24.094144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:24.094188 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:24.094191 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert\") pod \"ingress-canary-f5mvw\" (UID: \"000af909-ad67-432e-8f90-beaf88b66978\") " pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:26:24.094763 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:24.094276 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:26:24.094763 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:24.094277 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:26:24.094763 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:24.094328 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert podName:000af909-ad67-432e-8f90-beaf88b66978 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:40.09431408 +0000 UTC m=+64.760334700 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert") pod "ingress-canary-f5mvw" (UID: "000af909-ad67-432e-8f90-beaf88b66978") : secret "canary-serving-cert" not found Apr 21 14:26:24.094763 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:24.094341 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls podName:91d1fedb-0634-4e6a-97a4-81d44c528d9f nodeName:}" failed. No retries permitted until 2026-04-21 14:26:40.094335177 +0000 UTC m=+64.760355796 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls") pod "dns-default-x44hx" (UID: "91d1fedb-0634-4e6a-97a4-81d44c528d9f") : secret "dns-default-metrics-tls" not found Apr 21 14:26:33.939549 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:33.939519 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tn95v" Apr 21 14:26:40.108244 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:40.108202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:26:40.108244 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:40.108245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert\") pod \"ingress-canary-f5mvw\" (UID: \"000af909-ad67-432e-8f90-beaf88b66978\") " pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:26:40.108641 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:40.108358 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:26:40.108641 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:40.108410 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert podName:000af909-ad67-432e-8f90-beaf88b66978 nodeName:}" failed. No retries permitted until 2026-04-21 14:27:12.108397422 +0000 UTC m=+96.774418042 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert") pod "ingress-canary-f5mvw" (UID: "000af909-ad67-432e-8f90-beaf88b66978") : secret "canary-serving-cert" not found Apr 21 14:26:40.108641 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:40.108358 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:26:40.108641 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:40.108482 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls podName:91d1fedb-0634-4e6a-97a4-81d44c528d9f nodeName:}" failed. No retries permitted until 2026-04-21 14:27:12.108469492 +0000 UTC m=+96.774490126 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls") pod "dns-default-x44hx" (UID: "91d1fedb-0634-4e6a-97a4-81d44c528d9f") : secret "dns-default-metrics-tls" not found Apr 21 14:26:41.516661 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:41.516624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fw8c\" (UniqueName: \"kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c\") pod \"network-check-target-64qpt\" (UID: \"d952ffe7-f15c-40d1-840a-1201959dee55\") " pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:26:41.516661 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:41.516666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:26:41.520347 ip-10-0-138-110 
kubenswrapper[2576]: I0421 14:26:41.520325 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 14:26:41.520456 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:41.520437 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 14:26:41.527140 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:41.527099 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 14:26:41.527195 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:26:41.527184 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs podName:bbebdf2e-6e19-434f-9e3e-bb9880123092 nodeName:}" failed. No retries permitted until 2026-04-21 14:27:45.527167086 +0000 UTC m=+130.193187710 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs") pod "network-metrics-daemon-5bc88" (UID: "bbebdf2e-6e19-434f-9e3e-bb9880123092") : secret "metrics-daemon-secret" not found Apr 21 14:26:41.529349 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:41.529331 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 14:26:41.541289 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:41.541270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fw8c\" (UniqueName: \"kubernetes.io/projected/d952ffe7-f15c-40d1-840a-1201959dee55-kube-api-access-2fw8c\") pod \"network-check-target-64qpt\" (UID: \"d952ffe7-f15c-40d1-840a-1201959dee55\") " pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:26:41.604262 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:41.604230 
2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fd928\"" Apr 21 14:26:41.612336 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:41.612316 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:26:41.725252 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:41.725224 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-64qpt"] Apr 21 14:26:41.728620 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:26:41.728595 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd952ffe7_f15c_40d1_840a_1201959dee55.slice/crio-736047aaa36bde41632c64054e88d7f8acdbb1afc9c5d42fd4b01646244ab739 WatchSource:0}: Error finding container 736047aaa36bde41632c64054e88d7f8acdbb1afc9c5d42fd4b01646244ab739: Status 404 returned error can't find the container with id 736047aaa36bde41632c64054e88d7f8acdbb1afc9c5d42fd4b01646244ab739 Apr 21 14:26:42.020479 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:42.020444 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-64qpt" event={"ID":"d952ffe7-f15c-40d1-840a-1201959dee55","Type":"ContainerStarted","Data":"736047aaa36bde41632c64054e88d7f8acdbb1afc9c5d42fd4b01646244ab739"} Apr 21 14:26:45.028276 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:45.028247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-64qpt" event={"ID":"d952ffe7-f15c-40d1-840a-1201959dee55","Type":"ContainerStarted","Data":"d1bf324449ac3ca67d5c95f32b5bca5f544ec315fcc5d599a1a2b6e0c0655ee8"} Apr 21 14:26:45.028695 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:45.028362 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:26:45.043590 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:26:45.043545 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-64qpt" podStartSLOduration=66.849700187 podStartE2EDuration="1m10.043532608s" podCreationTimestamp="2026-04-21 14:25:35 +0000 UTC" firstStartedPulling="2026-04-21 14:26:41.73102836 +0000 UTC m=+66.397048980" lastFinishedPulling="2026-04-21 14:26:44.924860778 +0000 UTC m=+69.590881401" observedRunningTime="2026-04-21 14:26:45.042668217 +0000 UTC m=+69.708688856" watchObservedRunningTime="2026-04-21 14:26:45.043532608 +0000 UTC m=+69.709553271" Apr 21 14:27:12.152951 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:27:12.152793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:27:12.152951 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:27:12.152858 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert\") pod \"ingress-canary-f5mvw\" (UID: \"000af909-ad67-432e-8f90-beaf88b66978\") " pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:27:12.152951 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:27:12.152942 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:27:12.153521 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:27:12.152981 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:27:12.153521 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:27:12.153017 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls podName:91d1fedb-0634-4e6a-97a4-81d44c528d9f nodeName:}" failed. No retries permitted until 2026-04-21 14:28:16.153001317 +0000 UTC m=+160.819021937 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls") pod "dns-default-x44hx" (UID: "91d1fedb-0634-4e6a-97a4-81d44c528d9f") : secret "dns-default-metrics-tls" not found Apr 21 14:27:12.153521 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:27:12.153031 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert podName:000af909-ad67-432e-8f90-beaf88b66978 nodeName:}" failed. No retries permitted until 2026-04-21 14:28:16.153025162 +0000 UTC m=+160.819045782 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert") pod "ingress-canary-f5mvw" (UID: "000af909-ad67-432e-8f90-beaf88b66978") : secret "canary-serving-cert" not found Apr 21 14:27:16.034621 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:27:16.034594 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-64qpt" Apr 21 14:27:41.074572 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:27:41.074544 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5bc2h_fbd47f5e-fbd3-42b6-9631-93eae13c275b/dns-node-resolver/0.log" Apr 21 14:27:42.277041 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:27:42.276988 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nwm4s_a9148116-ff1f-4dd1-b372-36d40a8132b7/node-ca/0.log" Apr 21 14:27:45.586131 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:27:45.586076 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:27:45.586492 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:27:45.586248 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 14:27:45.586492 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:27:45.586330 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs podName:bbebdf2e-6e19-434f-9e3e-bb9880123092 nodeName:}" failed. No retries permitted until 2026-04-21 14:29:47.586312959 +0000 UTC m=+252.252333583 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs") pod "network-metrics-daemon-5bc88" (UID: "bbebdf2e-6e19-434f-9e3e-bb9880123092") : secret "metrics-daemon-secret" not found Apr 21 14:28:11.238583 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:28:11.238541 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-x44hx" podUID="91d1fedb-0634-4e6a-97a4-81d44c528d9f" Apr 21 14:28:11.246756 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:28:11.246707 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-f5mvw" podUID="000af909-ad67-432e-8f90-beaf88b66978" Apr 21 14:28:12.230321 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.230287 2576 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-dns/dns-default-x44hx" Apr 21 14:28:12.557530 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.557451 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2shrn"] Apr 21 14:28:12.559305 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.559280 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.561523 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.561503 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 14:28:12.561614 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.561534 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 14:28:12.562369 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.562352 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 14:28:12.562432 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.562404 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qlm9d\"" Apr 21 14:28:12.562481 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.562440 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 14:28:12.572626 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.572604 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2shrn"] Apr 21 14:28:12.683617 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.683582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/091d8da6-1925-4483-b412-411d3ab3ec20-data-volume\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.683617 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.683618 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/091d8da6-1925-4483-b412-411d3ab3ec20-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.683813 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.683668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/091d8da6-1925-4483-b412-411d3ab3ec20-crio-socket\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.683813 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.683751 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px4z6\" (UniqueName: \"kubernetes.io/projected/091d8da6-1925-4483-b412-411d3ab3ec20-kube-api-access-px4z6\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.683813 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.683775 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/091d8da6-1925-4483-b412-411d3ab3ec20-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " 
pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.784712 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.784675 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/091d8da6-1925-4483-b412-411d3ab3ec20-data-volume\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.784712 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.784711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/091d8da6-1925-4483-b412-411d3ab3ec20-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.784908 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.784837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/091d8da6-1925-4483-b412-411d3ab3ec20-crio-socket\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.784951 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.784923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-px4z6\" (UniqueName: \"kubernetes.io/projected/091d8da6-1925-4483-b412-411d3ab3ec20-kube-api-access-px4z6\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.784951 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.784937 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/091d8da6-1925-4483-b412-411d3ab3ec20-crio-socket\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.785022 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.784961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/091d8da6-1925-4483-b412-411d3ab3ec20-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.785077 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.785059 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/091d8da6-1925-4483-b412-411d3ab3ec20-data-volume\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.785399 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.785377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/091d8da6-1925-4483-b412-411d3ab3ec20-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.786948 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.786932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/091d8da6-1925-4483-b412-411d3ab3ec20-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.795975 ip-10-0-138-110 
kubenswrapper[2576]: I0421 14:28:12.795949 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-px4z6\" (UniqueName: \"kubernetes.io/projected/091d8da6-1925-4483-b412-411d3ab3ec20-kube-api-access-px4z6\") pod \"insights-runtime-extractor-2shrn\" (UID: \"091d8da6-1925-4483-b412-411d3ab3ec20\") " pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.811970 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:28:12.811900 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-5bc88" podUID="bbebdf2e-6e19-434f-9e3e-bb9880123092" Apr 21 14:28:12.868523 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.868497 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2shrn" Apr 21 14:28:12.988154 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:12.988125 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2shrn"] Apr 21 14:28:12.991421 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:28:12.991388 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod091d8da6_1925_4483_b412_411d3ab3ec20.slice/crio-e8c9ce6513275fc9806ef1df8efbaff8f6c5b044978ed42f04519999307eae8a WatchSource:0}: Error finding container e8c9ce6513275fc9806ef1df8efbaff8f6c5b044978ed42f04519999307eae8a: Status 404 returned error can't find the container with id e8c9ce6513275fc9806ef1df8efbaff8f6c5b044978ed42f04519999307eae8a Apr 21 14:28:13.233936 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:13.233899 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2shrn" 
event={"ID":"091d8da6-1925-4483-b412-411d3ab3ec20","Type":"ContainerStarted","Data":"93dbaee6224cd3ee0d139b60af6a7132837186110d65572340eb8864c6d59eed"} Apr 21 14:28:13.233936 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:13.233934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2shrn" event={"ID":"091d8da6-1925-4483-b412-411d3ab3ec20","Type":"ContainerStarted","Data":"e8c9ce6513275fc9806ef1df8efbaff8f6c5b044978ed42f04519999307eae8a"} Apr 21 14:28:14.237798 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:14.237748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2shrn" event={"ID":"091d8da6-1925-4483-b412-411d3ab3ec20","Type":"ContainerStarted","Data":"93951341dd76960a5a4f70bc1f8f68fc4f3516e4a3b960893601f46745828100"} Apr 21 14:28:16.215206 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:16.215173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 14:28:16.215206 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:16.215210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert\") pod \"ingress-canary-f5mvw\" (UID: \"000af909-ad67-432e-8f90-beaf88b66978\") " pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:28:16.217599 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:16.217572 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91d1fedb-0634-4e6a-97a4-81d44c528d9f-metrics-tls\") pod \"dns-default-x44hx\" (UID: \"91d1fedb-0634-4e6a-97a4-81d44c528d9f\") " pod="openshift-dns/dns-default-x44hx" Apr 21 
14:28:16.217711 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:16.217690 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/000af909-ad67-432e-8f90-beaf88b66978-cert\") pod \"ingress-canary-f5mvw\" (UID: \"000af909-ad67-432e-8f90-beaf88b66978\") " pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:28:16.243786 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:16.243756 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2shrn" event={"ID":"091d8da6-1925-4483-b412-411d3ab3ec20","Type":"ContainerStarted","Data":"263179429f3df09b2f8bc26c8ae8e48a02cb48f843a4396e268839db49c35573"} Apr 21 14:28:16.262203 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:16.262158 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2shrn" podStartSLOduration=1.8972771229999998 podStartE2EDuration="4.262144534s" podCreationTimestamp="2026-04-21 14:28:12 +0000 UTC" firstStartedPulling="2026-04-21 14:28:13.050171336 +0000 UTC m=+157.716191967" lastFinishedPulling="2026-04-21 14:28:15.415038755 +0000 UTC m=+160.081059378" observedRunningTime="2026-04-21 14:28:16.261707383 +0000 UTC m=+160.927728026" watchObservedRunningTime="2026-04-21 14:28:16.262144534 +0000 UTC m=+160.928165176" Apr 21 14:28:16.433043 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:16.433014 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-88srx\"" Apr 21 14:28:16.441035 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:16.441017 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-x44hx" Apr 21 14:28:16.554506 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:16.554475 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x44hx"] Apr 21 14:28:16.558303 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:28:16.558268 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91d1fedb_0634_4e6a_97a4_81d44c528d9f.slice/crio-4ff427456dbb8554b701e2864d920ab012c65819d406e285b043e6c8d25e46ec WatchSource:0}: Error finding container 4ff427456dbb8554b701e2864d920ab012c65819d406e285b043e6c8d25e46ec: Status 404 returned error can't find the container with id 4ff427456dbb8554b701e2864d920ab012c65819d406e285b043e6c8d25e46ec Apr 21 14:28:17.247106 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:17.247068 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x44hx" event={"ID":"91d1fedb-0634-4e6a-97a4-81d44c528d9f","Type":"ContainerStarted","Data":"4ff427456dbb8554b701e2864d920ab012c65819d406e285b043e6c8d25e46ec"} Apr 21 14:28:18.251027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:18.250991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x44hx" event={"ID":"91d1fedb-0634-4e6a-97a4-81d44c528d9f","Type":"ContainerStarted","Data":"b2212b8603ff7712210754848f3256206ada5f23a5a0e48d7b39bf7ee1abc139"} Apr 21 14:28:18.251027 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:18.251028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x44hx" event={"ID":"91d1fedb-0634-4e6a-97a4-81d44c528d9f","Type":"ContainerStarted","Data":"827b0f12471842ad92c93ae48ac156014ea0907522c5ad5ea8829b256cbdaf58"} Apr 21 14:28:18.251503 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:18.251142 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-x44hx" Apr 21 14:28:18.268017 
ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:18.267967 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-x44hx" podStartSLOduration=129.145968858 podStartE2EDuration="2m10.267949893s" podCreationTimestamp="2026-04-21 14:26:08 +0000 UTC" firstStartedPulling="2026-04-21 14:28:16.560257006 +0000 UTC m=+161.226277630" lastFinishedPulling="2026-04-21 14:28:17.682238046 +0000 UTC m=+162.348258665" observedRunningTime="2026-04-21 14:28:18.266858933 +0000 UTC m=+162.932879577" watchObservedRunningTime="2026-04-21 14:28:18.267949893 +0000 UTC m=+162.933970532" Apr 21 14:28:19.192394 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.192357 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96"] Apr 21 14:28:19.194407 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.194392 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" Apr 21 14:28:19.197238 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.197215 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 21 14:28:19.198035 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.198008 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 14:28:19.198035 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.198028 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 14:28:19.198207 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.198008 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-q6bk6\"" Apr 21 14:28:19.198303 ip-10-0-138-110 kubenswrapper[2576]: I0421 
14:28:19.198291 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 21 14:28:19.198749 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.198732 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 14:28:19.205607 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.205584 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-gcqk5"] Apr 21 14:28:19.207465 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.207441 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96"] Apr 21 14:28:19.207578 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.207564 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.209873 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.209852 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 21 14:28:19.209974 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.209901 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 21 14:28:19.210054 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.210034 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-sbzd2\"" Apr 21 14:28:19.210178 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.210163 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 21 14:28:19.220381 ip-10-0-138-110 kubenswrapper[2576]: I0421 
14:28:19.220361 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-gcqk5"] Apr 21 14:28:19.222907 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.222859 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cg4jg"] Apr 21 14:28:19.224787 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.224767 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.226836 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.226816 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 14:28:19.226921 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.226906 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 14:28:19.227079 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.227066 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 14:28:19.227397 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.227383 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-k54lk\"" Apr 21 14:28:19.341431 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.341395 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.341883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.341436 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88e9e88c-e0de-4c90-9d5f-510016098205-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.341883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.341459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-root\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.341883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.341484 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-tls\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.341883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.341519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c88915e8-596e-4a44-94d4-04a3d507a949-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zzz96\" (UID: \"c88915e8-596e-4a44-94d4-04a3d507a949\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" Apr 21 14:28:19.341883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.341548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-wtmp\") pod 
\"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.341883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.341574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c88915e8-596e-4a44-94d4-04a3d507a949-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zzz96\" (UID: \"c88915e8-596e-4a44-94d4-04a3d507a949\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" Apr 21 14:28:19.341883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.341605 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s8xx\" (UniqueName: \"kubernetes.io/projected/c88915e8-596e-4a44-94d4-04a3d507a949-kube-api-access-8s8xx\") pod \"openshift-state-metrics-9d44df66c-zzz96\" (UID: \"c88915e8-596e-4a44-94d4-04a3d507a949\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" Apr 21 14:28:19.341883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.341720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4hjm\" (UniqueName: \"kubernetes.io/projected/88e9e88c-e0de-4c90-9d5f-510016098205-kube-api-access-m4hjm\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.341883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.341749 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c88915e8-596e-4a44-94d4-04a3d507a949-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zzz96\" (UID: \"c88915e8-596e-4a44-94d4-04a3d507a949\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" Apr 21 14:28:19.341883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.341770 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/88e9e88c-e0de-4c90-9d5f-510016098205-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.341883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.341826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/88e9e88c-e0de-4c90-9d5f-510016098205-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.341883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.341857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvmj2\" (UniqueName: \"kubernetes.io/projected/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-kube-api-access-kvmj2\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.342456 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.341982 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-sys\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.342456 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.342040 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-textfile\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.342456 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.342059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/88e9e88c-e0de-4c90-9d5f-510016098205-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.342456 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.342088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-metrics-client-ca\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.342456 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.342145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88e9e88c-e0de-4c90-9d5f-510016098205-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.342456 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.342195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-accelerators-collector-config\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.443512 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443439 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/88e9e88c-e0de-4c90-9d5f-510016098205-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.443512 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-metrics-client-ca\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.443512 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443504 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88e9e88c-e0de-4c90-9d5f-510016098205-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.443781 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-accelerators-collector-config\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " 
pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.443781 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.443781 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88e9e88c-e0de-4c90-9d5f-510016098205-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.443781 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-root\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.443781 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443610 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-tls\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.443781 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/c88915e8-596e-4a44-94d4-04a3d507a949-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zzz96\" (UID: \"c88915e8-596e-4a44-94d4-04a3d507a949\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" Apr 21 14:28:19.443781 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-wtmp\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.443781 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-root\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.443781 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c88915e8-596e-4a44-94d4-04a3d507a949-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zzz96\" (UID: \"c88915e8-596e-4a44-94d4-04a3d507a949\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" Apr 21 14:28:19.443781 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8s8xx\" (UniqueName: \"kubernetes.io/projected/c88915e8-596e-4a44-94d4-04a3d507a949-kube-api-access-8s8xx\") pod \"openshift-state-metrics-9d44df66c-zzz96\" (UID: \"c88915e8-596e-4a44-94d4-04a3d507a949\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" Apr 21 14:28:19.444291 ip-10-0-138-110 
kubenswrapper[2576]: I0421 14:28:19.443790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4hjm\" (UniqueName: \"kubernetes.io/projected/88e9e88c-e0de-4c90-9d5f-510016098205-kube-api-access-m4hjm\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.444291 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c88915e8-596e-4a44-94d4-04a3d507a949-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zzz96\" (UID: \"c88915e8-596e-4a44-94d4-04a3d507a949\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" Apr 21 14:28:19.444291 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/88e9e88c-e0de-4c90-9d5f-510016098205-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.444291 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/88e9e88c-e0de-4c90-9d5f-510016098205-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.444291 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443917 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/88e9e88c-e0de-4c90-9d5f-510016098205-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.444291 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvmj2\" (UniqueName: \"kubernetes.io/projected/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-kube-api-access-kvmj2\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.444291 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.443998 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-sys\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.444291 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.444045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-textfile\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.444291 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.444202 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-metrics-client-ca\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.444291 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.444218 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-accelerators-collector-config\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.444704 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:28:19.444308 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 14:28:19.444704 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.444333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88e9e88c-e0de-4c90-9d5f-510016098205-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.444704 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.444340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-textfile\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.444704 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:28:19.444365 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-tls podName:6d7bd865-ba9b-41eb-88f9-f4701a3c1880 nodeName:}" failed. No retries permitted until 2026-04-21 14:28:19.94434657 +0000 UTC m=+164.610367203 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-tls") pod "node-exporter-cg4jg" (UID: "6d7bd865-ba9b-41eb-88f9-f4701a3c1880") : secret "node-exporter-tls" not found Apr 21 14:28:19.444704 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.444391 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-sys\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.444704 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.444447 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c88915e8-596e-4a44-94d4-04a3d507a949-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zzz96\" (UID: \"c88915e8-596e-4a44-94d4-04a3d507a949\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" Apr 21 14:28:19.444704 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.444469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-wtmp\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.444704 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:28:19.444551 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 21 14:28:19.444704 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:28:19.444647 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e9e88c-e0de-4c90-9d5f-510016098205-kube-state-metrics-tls podName:88e9e88c-e0de-4c90-9d5f-510016098205 nodeName:}" failed. 
No retries permitted until 2026-04-21 14:28:19.94463011 +0000 UTC m=+164.610650737 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/88e9e88c-e0de-4c90-9d5f-510016098205-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-gcqk5" (UID: "88e9e88c-e0de-4c90-9d5f-510016098205") : secret "kube-state-metrics-tls" not found Apr 21 14:28:19.445182 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.445002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/88e9e88c-e0de-4c90-9d5f-510016098205-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.446421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.446394 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.446555 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.446520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c88915e8-596e-4a44-94d4-04a3d507a949-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zzz96\" (UID: \"c88915e8-596e-4a44-94d4-04a3d507a949\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" Apr 21 14:28:19.446859 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.446834 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88e9e88c-e0de-4c90-9d5f-510016098205-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.446859 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.446851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c88915e8-596e-4a44-94d4-04a3d507a949-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zzz96\" (UID: \"c88915e8-596e-4a44-94d4-04a3d507a949\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" Apr 21 14:28:19.452034 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.452011 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4hjm\" (UniqueName: \"kubernetes.io/projected/88e9e88c-e0de-4c90-9d5f-510016098205-kube-api-access-m4hjm\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.452631 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.452590 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s8xx\" (UniqueName: \"kubernetes.io/projected/c88915e8-596e-4a44-94d4-04a3d507a949-kube-api-access-8s8xx\") pod \"openshift-state-metrics-9d44df66c-zzz96\" (UID: \"c88915e8-596e-4a44-94d4-04a3d507a949\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" Apr 21 14:28:19.452846 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.452824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvmj2\" (UniqueName: \"kubernetes.io/projected/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-kube-api-access-kvmj2\") pod \"node-exporter-cg4jg\" (UID: 
\"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.503317 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.503284 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" Apr 21 14:28:19.615975 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.615943 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96"] Apr 21 14:28:19.619196 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:28:19.619168 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc88915e8_596e_4a44_94d4_04a3d507a949.slice/crio-31107b40e7e56ff18d0d65b84ba2ec07a1e9c121fad426932a339cd272a125d8 WatchSource:0}: Error finding container 31107b40e7e56ff18d0d65b84ba2ec07a1e9c121fad426932a339cd272a125d8: Status 404 returned error can't find the container with id 31107b40e7e56ff18d0d65b84ba2ec07a1e9c121fad426932a339cd272a125d8 Apr 21 14:28:19.948192 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.948096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-tls\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.948323 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.948189 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/88e9e88c-e0de-4c90-9d5f-510016098205-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:19.950404 ip-10-0-138-110 kubenswrapper[2576]: I0421 
14:28:19.950373 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d7bd865-ba9b-41eb-88f9-f4701a3c1880-node-exporter-tls\") pod \"node-exporter-cg4jg\" (UID: \"6d7bd865-ba9b-41eb-88f9-f4701a3c1880\") " pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:19.950510 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:19.950420 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/88e9e88c-e0de-4c90-9d5f-510016098205-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-gcqk5\" (UID: \"88e9e88c-e0de-4c90-9d5f-510016098205\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:20.115909 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:20.115873 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" Apr 21 14:28:20.132670 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:20.132637 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-cg4jg" Apr 21 14:28:20.142869 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:28:20.142835 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d7bd865_ba9b_41eb_88f9_f4701a3c1880.slice/crio-d96b63e06fcbff67d94f1b85a366dccc394c971c980f31f5b9588d9d244f8b22 WatchSource:0}: Error finding container d96b63e06fcbff67d94f1b85a366dccc394c971c980f31f5b9588d9d244f8b22: Status 404 returned error can't find the container with id d96b63e06fcbff67d94f1b85a366dccc394c971c980f31f5b9588d9d244f8b22 Apr 21 14:28:20.234266 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:20.234180 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-gcqk5"] Apr 21 14:28:20.237477 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:28:20.237452 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e9e88c_e0de_4c90_9d5f_510016098205.slice/crio-95379ea6f92cd08a1870df0dab38776d3dea155bd09175c7a767efb2b469a9f4 WatchSource:0}: Error finding container 95379ea6f92cd08a1870df0dab38776d3dea155bd09175c7a767efb2b469a9f4: Status 404 returned error can't find the container with id 95379ea6f92cd08a1870df0dab38776d3dea155bd09175c7a767efb2b469a9f4 Apr 21 14:28:20.262951 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:20.262915 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" event={"ID":"c88915e8-596e-4a44-94d4-04a3d507a949","Type":"ContainerStarted","Data":"8f57b0b8912fc2b5cd6acb629ea81010ac3f50bf2cfdee6b4df80a132c034124"} Apr 21 14:28:20.262951 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:20.262954 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" 
event={"ID":"c88915e8-596e-4a44-94d4-04a3d507a949","Type":"ContainerStarted","Data":"af0b547a134ac3a2fccba9a68fbadfaed84b34e846f11a31b41639485dddcae3"} Apr 21 14:28:20.263178 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:20.262967 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" event={"ID":"c88915e8-596e-4a44-94d4-04a3d507a949","Type":"ContainerStarted","Data":"31107b40e7e56ff18d0d65b84ba2ec07a1e9c121fad426932a339cd272a125d8"} Apr 21 14:28:20.263980 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:20.263954 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cg4jg" event={"ID":"6d7bd865-ba9b-41eb-88f9-f4701a3c1880","Type":"ContainerStarted","Data":"d96b63e06fcbff67d94f1b85a366dccc394c971c980f31f5b9588d9d244f8b22"} Apr 21 14:28:20.264903 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:20.264876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" event={"ID":"88e9e88c-e0de-4c90-9d5f-510016098205","Type":"ContainerStarted","Data":"95379ea6f92cd08a1870df0dab38776d3dea155bd09175c7a767efb2b469a9f4"} Apr 21 14:28:20.266066 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:20.266044 2576 generic.go:358] "Generic (PLEG): container finished" podID="48ee9e5b-b035-4559-9477-c5c9b335b538" containerID="12f5212fdf75b9091179f05c3346774fed4c378c4a507421c5675a076fbe1455" exitCode=255 Apr 21 14:28:20.266180 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:20.266079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc" event={"ID":"48ee9e5b-b035-4559-9477-c5c9b335b538","Type":"ContainerDied","Data":"12f5212fdf75b9091179f05c3346774fed4c378c4a507421c5675a076fbe1455"} Apr 21 14:28:20.266422 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:20.266406 2576 scope.go:117] "RemoveContainer" 
containerID="12f5212fdf75b9091179f05c3346774fed4c378c4a507421c5675a076fbe1455" Apr 21 14:28:21.270127 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:21.270080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbc9dc6dc-bstgc" event={"ID":"48ee9e5b-b035-4559-9477-c5c9b335b538","Type":"ContainerStarted","Data":"1356d742e4642e49cee7c4caf9fc279ef428ca71ae5241f95b029fc6c4dd9703"} Apr 21 14:28:21.272079 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:21.272047 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" event={"ID":"c88915e8-596e-4a44-94d4-04a3d507a949","Type":"ContainerStarted","Data":"e2f7311978a52c3304bff8ff1cf9fcfdf9e626be960ddd6f428288ef462feea8"} Apr 21 14:28:21.273546 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:21.273518 2576 generic.go:358] "Generic (PLEG): container finished" podID="6d7bd865-ba9b-41eb-88f9-f4701a3c1880" containerID="8f6a60cefa8e83384405c45a6e21d746d4dbcc90eaf040fff4fa1f8464a338d3" exitCode=0 Apr 21 14:28:21.273673 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:21.273549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cg4jg" event={"ID":"6d7bd865-ba9b-41eb-88f9-f4701a3c1880","Type":"ContainerDied","Data":"8f6a60cefa8e83384405c45a6e21d746d4dbcc90eaf040fff4fa1f8464a338d3"} Apr 21 14:28:21.300764 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:21.300712 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zzz96" podStartSLOduration=1.303131378 podStartE2EDuration="2.300695519s" podCreationTimestamp="2026-04-21 14:28:19 +0000 UTC" firstStartedPulling="2026-04-21 14:28:19.721497106 +0000 UTC m=+164.387517730" lastFinishedPulling="2026-04-21 14:28:20.719061235 +0000 UTC m=+165.385081871" observedRunningTime="2026-04-21 14:28:21.300495031 +0000 UTC 
m=+165.966515676" watchObservedRunningTime="2026-04-21 14:28:21.300695519 +0000 UTC m=+165.966716161" Apr 21 14:28:22.278517 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:22.278436 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cg4jg" event={"ID":"6d7bd865-ba9b-41eb-88f9-f4701a3c1880","Type":"ContainerStarted","Data":"c55d9352ba54ba36443fbc7ec47d44641d7865f4eeb89d012e92d32525401b95"} Apr 21 14:28:22.278517 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:22.278472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cg4jg" event={"ID":"6d7bd865-ba9b-41eb-88f9-f4701a3c1880","Type":"ContainerStarted","Data":"a9eddfdb9622a7d0e689e1021d8f1a5603f0c7dff72ab48b029432fc0d29f60b"} Apr 21 14:28:22.280601 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:22.280575 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" event={"ID":"88e9e88c-e0de-4c90-9d5f-510016098205","Type":"ContainerStarted","Data":"fffae676ccedf9036bbf4da14c0821d6d5e3b5b84111bbfd7549939639e42df9"} Apr 21 14:28:22.280719 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:22.280605 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" event={"ID":"88e9e88c-e0de-4c90-9d5f-510016098205","Type":"ContainerStarted","Data":"3d460bafb9763f35d2cb3a5fe104d0960d245913e790dacb98e5951bca03cda8"} Apr 21 14:28:22.280719 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:22.280616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" event={"ID":"88e9e88c-e0de-4c90-9d5f-510016098205","Type":"ContainerStarted","Data":"73949f64928ab7dff9c794df758e2d066d9c93fae8306a9854b648b411feab50"} Apr 21 14:28:22.298809 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:22.298753 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/node-exporter-cg4jg" podStartSLOduration=2.382131193 podStartE2EDuration="3.298737701s" podCreationTimestamp="2026-04-21 14:28:19 +0000 UTC" firstStartedPulling="2026-04-21 14:28:20.145400948 +0000 UTC m=+164.811421570" lastFinishedPulling="2026-04-21 14:28:21.06200745 +0000 UTC m=+165.728028078" observedRunningTime="2026-04-21 14:28:22.297638583 +0000 UTC m=+166.963659226" watchObservedRunningTime="2026-04-21 14:28:22.298737701 +0000 UTC m=+166.964758345" Apr 21 14:28:22.316702 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:22.316654 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-gcqk5" podStartSLOduration=1.793888355 podStartE2EDuration="3.316638811s" podCreationTimestamp="2026-04-21 14:28:19 +0000 UTC" firstStartedPulling="2026-04-21 14:28:20.239201747 +0000 UTC m=+164.905222366" lastFinishedPulling="2026-04-21 14:28:21.761952201 +0000 UTC m=+166.427972822" observedRunningTime="2026-04-21 14:28:22.314954053 +0000 UTC m=+166.980974711" watchObservedRunningTime="2026-04-21 14:28:22.316638811 +0000 UTC m=+166.982659450" Apr 21 14:28:22.790799 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:22.790772 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:28:22.793051 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:22.793028 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pqf56\"" Apr 21 14:28:22.801101 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:22.801074 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f5mvw" Apr 21 14:28:22.916906 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:22.916877 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-f5mvw"] Apr 21 14:28:22.919742 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:28:22.919713 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod000af909_ad67_432e_8f90_beaf88b66978.slice/crio-bcbb88e54c4f86b306831d712c400e609374c33de9ed3ed2c50bebcceb79999b WatchSource:0}: Error finding container bcbb88e54c4f86b306831d712c400e609374c33de9ed3ed2c50bebcceb79999b: Status 404 returned error can't find the container with id bcbb88e54c4f86b306831d712c400e609374c33de9ed3ed2c50bebcceb79999b Apr 21 14:28:23.245034 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:23.245000 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-b4hc4"] Apr 21 14:28:23.249429 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:23.249411 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-b4hc4" Apr 21 14:28:23.251596 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:23.251571 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 14:28:23.251702 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:23.251639 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-wzn5n\"" Apr 21 14:28:23.251769 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:23.251754 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 14:28:23.256619 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:23.256591 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-b4hc4"] Apr 21 14:28:23.284280 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:23.284237 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-f5mvw" event={"ID":"000af909-ad67-432e-8f90-beaf88b66978","Type":"ContainerStarted","Data":"bcbb88e54c4f86b306831d712c400e609374c33de9ed3ed2c50bebcceb79999b"} Apr 21 14:28:23.377627 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:23.377576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8g8t\" (UniqueName: \"kubernetes.io/projected/b3417a6b-baaf-488b-99aa-0815ee21a3b9-kube-api-access-r8g8t\") pod \"downloads-6bcc868b7-b4hc4\" (UID: \"b3417a6b-baaf-488b-99aa-0815ee21a3b9\") " pod="openshift-console/downloads-6bcc868b7-b4hc4" Apr 21 14:28:23.478520 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:23.478479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8g8t\" (UniqueName: \"kubernetes.io/projected/b3417a6b-baaf-488b-99aa-0815ee21a3b9-kube-api-access-r8g8t\") pod 
\"downloads-6bcc868b7-b4hc4\" (UID: \"b3417a6b-baaf-488b-99aa-0815ee21a3b9\") " pod="openshift-console/downloads-6bcc868b7-b4hc4" Apr 21 14:28:23.486658 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:23.486634 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8g8t\" (UniqueName: \"kubernetes.io/projected/b3417a6b-baaf-488b-99aa-0815ee21a3b9-kube-api-access-r8g8t\") pod \"downloads-6bcc868b7-b4hc4\" (UID: \"b3417a6b-baaf-488b-99aa-0815ee21a3b9\") " pod="openshift-console/downloads-6bcc868b7-b4hc4" Apr 21 14:28:23.563229 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:23.563142 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-b4hc4" Apr 21 14:28:23.700965 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:23.700933 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-b4hc4"] Apr 21 14:28:23.703845 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:28:23.703799 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3417a6b_baaf_488b_99aa_0815ee21a3b9.slice/crio-e8cb5bdbacad492834d0c2d9b4e9c5808509c7a5afb41c6471931cfb9054685e WatchSource:0}: Error finding container e8cb5bdbacad492834d0c2d9b4e9c5808509c7a5afb41c6471931cfb9054685e: Status 404 returned error can't find the container with id e8cb5bdbacad492834d0c2d9b4e9c5808509c7a5afb41c6471931cfb9054685e Apr 21 14:28:24.289557 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:24.289523 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-b4hc4" event={"ID":"b3417a6b-baaf-488b-99aa-0815ee21a3b9","Type":"ContainerStarted","Data":"e8cb5bdbacad492834d0c2d9b4e9c5808509c7a5afb41c6471931cfb9054685e"} Apr 21 14:28:25.293070 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:25.293026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-f5mvw" event={"ID":"000af909-ad67-432e-8f90-beaf88b66978","Type":"ContainerStarted","Data":"fef1d20328985c07620f6e484988f2f525541cd968c32bbe7625f5c9dd43b5be"} Apr 21 14:28:25.307899 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:25.307851 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-f5mvw" podStartSLOduration=135.612728628 podStartE2EDuration="2m17.307833796s" podCreationTimestamp="2026-04-21 14:26:08 +0000 UTC" firstStartedPulling="2026-04-21 14:28:22.921710072 +0000 UTC m=+167.587730703" lastFinishedPulling="2026-04-21 14:28:24.61681525 +0000 UTC m=+169.282835871" observedRunningTime="2026-04-21 14:28:25.307388918 +0000 UTC m=+169.973409583" watchObservedRunningTime="2026-04-21 14:28:25.307833796 +0000 UTC m=+169.973854444" Apr 21 14:28:25.793612 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:25.793562 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:28:28.258504 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:28.258466 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-x44hx" Apr 21 14:28:47.352171 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:47.352136 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-b4hc4" event={"ID":"b3417a6b-baaf-488b-99aa-0815ee21a3b9","Type":"ContainerStarted","Data":"22d4c8ae29d0660aa77589f8bff2af36ca4b6d567fdb5f829424bd3d6500f257"} Apr 21 14:28:47.352524 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:47.352330 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-b4hc4" Apr 21 14:28:47.353670 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:47.353650 2576 patch_prober.go:28] interesting pod/downloads-6bcc868b7-b4hc4 container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.133.0.13:8080/\": dial tcp 10.133.0.13:8080: connect: connection refused" start-of-body= Apr 21 14:28:47.353727 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:47.353694 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-b4hc4" podUID="b3417a6b-baaf-488b-99aa-0815ee21a3b9" containerName="download-server" probeResult="failure" output="Get \"http://10.133.0.13:8080/\": dial tcp 10.133.0.13:8080: connect: connection refused" Apr 21 14:28:47.373950 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:47.373898 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-b4hc4" podStartSLOduration=0.943942136 podStartE2EDuration="24.373884431s" podCreationTimestamp="2026-04-21 14:28:23 +0000 UTC" firstStartedPulling="2026-04-21 14:28:23.705972312 +0000 UTC m=+168.371992932" lastFinishedPulling="2026-04-21 14:28:47.135914604 +0000 UTC m=+191.801935227" observedRunningTime="2026-04-21 14:28:47.372051703 +0000 UTC m=+192.038072368" watchObservedRunningTime="2026-04-21 14:28:47.373884431 +0000 UTC m=+192.039905073" Apr 21 14:28:48.368452 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:48.368421 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-b4hc4" Apr 21 14:28:56.447762 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:28:56.447723 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" podUID="0984855a-0ffe-408b-ba32-08b6b8ead44e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 14:29:06.448075 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:06.448030 2576 prober.go:120] "Probe failed" probeType="Liveness" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" podUID="0984855a-0ffe-408b-ba32-08b6b8ead44e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 14:29:16.447973 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:16.447924 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" podUID="0984855a-0ffe-408b-ba32-08b6b8ead44e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 14:29:16.448394 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:16.447995 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" Apr 21 14:29:16.448528 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:16.448497 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"ca9684902777024f8b7a4feb0c9b5611c5ab7055cc775fcd59430c274a52cbce"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 21 14:29:16.448577 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:16.448562 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" podUID="0984855a-0ffe-408b-ba32-08b6b8ead44e" containerName="service-proxy" containerID="cri-o://ca9684902777024f8b7a4feb0c9b5611c5ab7055cc775fcd59430c274a52cbce" gracePeriod=30 Apr 21 14:29:17.433014 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:17.432981 2576 generic.go:358] "Generic (PLEG): container finished" podID="0984855a-0ffe-408b-ba32-08b6b8ead44e" containerID="ca9684902777024f8b7a4feb0c9b5611c5ab7055cc775fcd59430c274a52cbce" exitCode=2 Apr 21 
14:29:17.433249 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:17.433038 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" event={"ID":"0984855a-0ffe-408b-ba32-08b6b8ead44e","Type":"ContainerDied","Data":"ca9684902777024f8b7a4feb0c9b5611c5ab7055cc775fcd59430c274a52cbce"} Apr 21 14:29:17.433249 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:17.433075 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cb97d8b6f-dhsrg" event={"ID":"0984855a-0ffe-408b-ba32-08b6b8ead44e","Type":"ContainerStarted","Data":"61a94bd64bb075c95aaae2601ad88b97fab47d6e59102e9ca9ad1a2d34b614d7"} Apr 21 14:29:47.645661 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:47.645620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:29:47.647898 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:47.647879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbebdf2e-6e19-434f-9e3e-bb9880123092-metrics-certs\") pod \"network-metrics-daemon-5bc88\" (UID: \"bbebdf2e-6e19-434f-9e3e-bb9880123092\") " pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:29:47.696060 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:47.696028 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9wlmf\"" Apr 21 14:29:47.704826 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:47.704804 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bc88" Apr 21 14:29:47.840283 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:47.840249 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5bc88"] Apr 21 14:29:47.843086 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:29:47.843055 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbebdf2e_6e19_434f_9e3e_bb9880123092.slice/crio-ad17aff5ac58e13b4c7805a76efa8a92fc7fb9b493aee6e6dbacd0ded31ea89c WatchSource:0}: Error finding container ad17aff5ac58e13b4c7805a76efa8a92fc7fb9b493aee6e6dbacd0ded31ea89c: Status 404 returned error can't find the container with id ad17aff5ac58e13b4c7805a76efa8a92fc7fb9b493aee6e6dbacd0ded31ea89c Apr 21 14:29:48.519229 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:48.519192 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bc88" event={"ID":"bbebdf2e-6e19-434f-9e3e-bb9880123092","Type":"ContainerStarted","Data":"ad17aff5ac58e13b4c7805a76efa8a92fc7fb9b493aee6e6dbacd0ded31ea89c"} Apr 21 14:29:49.523701 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:49.523661 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bc88" event={"ID":"bbebdf2e-6e19-434f-9e3e-bb9880123092","Type":"ContainerStarted","Data":"b0916af923dbb29155441be0fd07db3112880aa02d4f9a2c2534b5ab0cb8889c"} Apr 21 14:29:49.523701 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:49.523700 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bc88" event={"ID":"bbebdf2e-6e19-434f-9e3e-bb9880123092","Type":"ContainerStarted","Data":"2da3385376ad81383a31f53a187e18786d55177bb5a06b0e72293a5e33d17252"} Apr 21 14:29:49.541840 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:29:49.541789 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-5bc88" podStartSLOduration=253.692296676 podStartE2EDuration="4m14.541773925s" podCreationTimestamp="2026-04-21 14:25:35 +0000 UTC" firstStartedPulling="2026-04-21 14:29:47.845008047 +0000 UTC m=+252.511028667" lastFinishedPulling="2026-04-21 14:29:48.694485282 +0000 UTC m=+253.360505916" observedRunningTime="2026-04-21 14:29:49.540180013 +0000 UTC m=+254.206200669" watchObservedRunningTime="2026-04-21 14:29:49.541773925 +0000 UTC m=+254.207794581" Apr 21 14:30:35.719814 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:30:35.719787 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 14:35:48.915181 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:48.915138 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22"] Apr 21 14:35:48.917238 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:48.917220 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" Apr 21 14:35:48.919574 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:48.919553 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 14:35:48.919693 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:48.919621 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qj2gg\"" Apr 21 14:35:48.919693 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:48.919558 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 14:35:48.925494 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:48.925461 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22"] Apr 21 14:35:48.971028 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:48.970990 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7479c98e-75a6-4336-85f3-bd578d1072ef-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22\" (UID: \"7479c98e-75a6-4336-85f3-bd578d1072ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" Apr 21 14:35:48.971221 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:48.971042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7479c98e-75a6-4336-85f3-bd578d1072ef-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22\" (UID: \"7479c98e-75a6-4336-85f3-bd578d1072ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" Apr 21 14:35:48.971221 
ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:48.971096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sdlj\" (UniqueName: \"kubernetes.io/projected/7479c98e-75a6-4336-85f3-bd578d1072ef-kube-api-access-7sdlj\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22\" (UID: \"7479c98e-75a6-4336-85f3-bd578d1072ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" Apr 21 14:35:49.071483 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:49.071443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7479c98e-75a6-4336-85f3-bd578d1072ef-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22\" (UID: \"7479c98e-75a6-4336-85f3-bd578d1072ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" Apr 21 14:35:49.071650 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:49.071500 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7479c98e-75a6-4336-85f3-bd578d1072ef-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22\" (UID: \"7479c98e-75a6-4336-85f3-bd578d1072ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" Apr 21 14:35:49.071650 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:49.071541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sdlj\" (UniqueName: \"kubernetes.io/projected/7479c98e-75a6-4336-85f3-bd578d1072ef-kube-api-access-7sdlj\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22\" (UID: \"7479c98e-75a6-4336-85f3-bd578d1072ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" Apr 21 14:35:49.071830 ip-10-0-138-110 
kubenswrapper[2576]: I0421 14:35:49.071810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7479c98e-75a6-4336-85f3-bd578d1072ef-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22\" (UID: \"7479c98e-75a6-4336-85f3-bd578d1072ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" Apr 21 14:35:49.071901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:49.071880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7479c98e-75a6-4336-85f3-bd578d1072ef-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22\" (UID: \"7479c98e-75a6-4336-85f3-bd578d1072ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" Apr 21 14:35:49.079921 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:49.079887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sdlj\" (UniqueName: \"kubernetes.io/projected/7479c98e-75a6-4336-85f3-bd578d1072ef-kube-api-access-7sdlj\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22\" (UID: \"7479c98e-75a6-4336-85f3-bd578d1072ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" Apr 21 14:35:49.226290 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:49.226205 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" Apr 21 14:35:49.345291 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:49.345259 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22"] Apr 21 14:35:49.348343 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:35:49.348312 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7479c98e_75a6_4336_85f3_bd578d1072ef.slice/crio-7d94fbdcb22cb4972dd0d3f50d7905aa495cdadfd32f146032b10efa2b48f08b WatchSource:0}: Error finding container 7d94fbdcb22cb4972dd0d3f50d7905aa495cdadfd32f146032b10efa2b48f08b: Status 404 returned error can't find the container with id 7d94fbdcb22cb4972dd0d3f50d7905aa495cdadfd32f146032b10efa2b48f08b Apr 21 14:35:49.350227 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:49.350212 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 14:35:49.437910 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:49.437871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" event={"ID":"7479c98e-75a6-4336-85f3-bd578d1072ef","Type":"ContainerStarted","Data":"7d94fbdcb22cb4972dd0d3f50d7905aa495cdadfd32f146032b10efa2b48f08b"} Apr 21 14:35:54.453560 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:54.453524 2576 generic.go:358] "Generic (PLEG): container finished" podID="7479c98e-75a6-4336-85f3-bd578d1072ef" containerID="6b7475576fb6c3891a1bdd681a1733956d6aaa5800ca1d60ed57c7f42784d4d8" exitCode=0 Apr 21 14:35:54.454030 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:54.453593 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" 
event={"ID":"7479c98e-75a6-4336-85f3-bd578d1072ef","Type":"ContainerDied","Data":"6b7475576fb6c3891a1bdd681a1733956d6aaa5800ca1d60ed57c7f42784d4d8"} Apr 21 14:35:57.463421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:57.463383 2576 generic.go:358] "Generic (PLEG): container finished" podID="7479c98e-75a6-4336-85f3-bd578d1072ef" containerID="8f5caf9e47c90b49e3f9f20906e5c4c790f428500f1a255d48860738f384a339" exitCode=0 Apr 21 14:35:57.463908 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:35:57.463457 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" event={"ID":"7479c98e-75a6-4336-85f3-bd578d1072ef","Type":"ContainerDied","Data":"8f5caf9e47c90b49e3f9f20906e5c4c790f428500f1a255d48860738f384a339"} Apr 21 14:36:03.481236 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:03.481192 2576 generic.go:358] "Generic (PLEG): container finished" podID="7479c98e-75a6-4336-85f3-bd578d1072ef" containerID="66b24942ed59b92141f8a6394a92c76c3b3eac821fc64ab312d8bdfc9808c401" exitCode=0 Apr 21 14:36:03.481617 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:03.481270 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" event={"ID":"7479c98e-75a6-4336-85f3-bd578d1072ef","Type":"ContainerDied","Data":"66b24942ed59b92141f8a6394a92c76c3b3eac821fc64ab312d8bdfc9808c401"} Apr 21 14:36:04.607949 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:04.607922 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" Apr 21 14:36:04.689473 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:04.689435 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7479c98e-75a6-4336-85f3-bd578d1072ef-bundle\") pod \"7479c98e-75a6-4336-85f3-bd578d1072ef\" (UID: \"7479c98e-75a6-4336-85f3-bd578d1072ef\") " Apr 21 14:36:04.689651 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:04.689562 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7479c98e-75a6-4336-85f3-bd578d1072ef-util\") pod \"7479c98e-75a6-4336-85f3-bd578d1072ef\" (UID: \"7479c98e-75a6-4336-85f3-bd578d1072ef\") " Apr 21 14:36:04.689651 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:04.689625 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sdlj\" (UniqueName: \"kubernetes.io/projected/7479c98e-75a6-4336-85f3-bd578d1072ef-kube-api-access-7sdlj\") pod \"7479c98e-75a6-4336-85f3-bd578d1072ef\" (UID: \"7479c98e-75a6-4336-85f3-bd578d1072ef\") " Apr 21 14:36:04.690175 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:04.690148 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7479c98e-75a6-4336-85f3-bd578d1072ef-bundle" (OuterVolumeSpecName: "bundle") pod "7479c98e-75a6-4336-85f3-bd578d1072ef" (UID: "7479c98e-75a6-4336-85f3-bd578d1072ef"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:36:04.691810 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:04.691782 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7479c98e-75a6-4336-85f3-bd578d1072ef-kube-api-access-7sdlj" (OuterVolumeSpecName: "kube-api-access-7sdlj") pod "7479c98e-75a6-4336-85f3-bd578d1072ef" (UID: "7479c98e-75a6-4336-85f3-bd578d1072ef"). InnerVolumeSpecName "kube-api-access-7sdlj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:36:04.693673 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:04.693655 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7479c98e-75a6-4336-85f3-bd578d1072ef-util" (OuterVolumeSpecName: "util") pod "7479c98e-75a6-4336-85f3-bd578d1072ef" (UID: "7479c98e-75a6-4336-85f3-bd578d1072ef"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:36:04.790373 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:04.790283 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7479c98e-75a6-4336-85f3-bd578d1072ef-bundle\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:36:04.790373 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:04.790311 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7479c98e-75a6-4336-85f3-bd578d1072ef-util\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:36:04.790373 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:04.790321 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7sdlj\" (UniqueName: \"kubernetes.io/projected/7479c98e-75a6-4336-85f3-bd578d1072ef-kube-api-access-7sdlj\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:36:05.491727 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:05.491682 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" event={"ID":"7479c98e-75a6-4336-85f3-bd578d1072ef","Type":"ContainerDied","Data":"7d94fbdcb22cb4972dd0d3f50d7905aa495cdadfd32f146032b10efa2b48f08b"} Apr 21 14:36:05.491727 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:05.491722 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d94fbdcb22cb4972dd0d3f50d7905aa495cdadfd32f146032b10efa2b48f08b" Apr 21 14:36:05.491727 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:05.491724 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsfp22" Apr 21 14:36:12.293171 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.293139 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hshnq"] Apr 21 14:36:12.293548 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.293378 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7479c98e-75a6-4336-85f3-bd578d1072ef" containerName="extract" Apr 21 14:36:12.293548 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.293390 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7479c98e-75a6-4336-85f3-bd578d1072ef" containerName="extract" Apr 21 14:36:12.293548 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.293406 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7479c98e-75a6-4336-85f3-bd578d1072ef" containerName="pull" Apr 21 14:36:12.293548 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.293411 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7479c98e-75a6-4336-85f3-bd578d1072ef" containerName="pull" Apr 21 14:36:12.293548 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.293420 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="7479c98e-75a6-4336-85f3-bd578d1072ef" containerName="util" Apr 21 14:36:12.293548 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.293426 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7479c98e-75a6-4336-85f3-bd578d1072ef" containerName="util" Apr 21 14:36:12.293548 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.293462 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7479c98e-75a6-4336-85f3-bd578d1072ef" containerName="extract" Apr 21 14:36:12.350017 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.349985 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hshnq"] Apr 21 14:36:12.350190 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.350095 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hshnq" Apr 21 14:36:12.357561 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.357537 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:36:12.358636 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.358617 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 21 14:36:12.365601 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.365586 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-9jvlm\"" Apr 21 14:36:12.448347 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.448309 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4e241db-df89-4d97-80ea-e543e1587fe3-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-hshnq\" (UID: 
\"c4e241db-df89-4d97-80ea-e543e1587fe3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hshnq" Apr 21 14:36:12.448503 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.448357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8drjw\" (UniqueName: \"kubernetes.io/projected/c4e241db-df89-4d97-80ea-e543e1587fe3-kube-api-access-8drjw\") pod \"cert-manager-operator-controller-manager-54b9655956-hshnq\" (UID: \"c4e241db-df89-4d97-80ea-e543e1587fe3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hshnq" Apr 21 14:36:12.549323 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.549235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4e241db-df89-4d97-80ea-e543e1587fe3-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-hshnq\" (UID: \"c4e241db-df89-4d97-80ea-e543e1587fe3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hshnq" Apr 21 14:36:12.549323 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.549280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8drjw\" (UniqueName: \"kubernetes.io/projected/c4e241db-df89-4d97-80ea-e543e1587fe3-kube-api-access-8drjw\") pod \"cert-manager-operator-controller-manager-54b9655956-hshnq\" (UID: \"c4e241db-df89-4d97-80ea-e543e1587fe3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hshnq" Apr 21 14:36:12.549632 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.549612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4e241db-df89-4d97-80ea-e543e1587fe3-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-hshnq\" (UID: \"c4e241db-df89-4d97-80ea-e543e1587fe3\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hshnq" Apr 21 14:36:12.580570 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.580543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8drjw\" (UniqueName: \"kubernetes.io/projected/c4e241db-df89-4d97-80ea-e543e1587fe3-kube-api-access-8drjw\") pod \"cert-manager-operator-controller-manager-54b9655956-hshnq\" (UID: \"c4e241db-df89-4d97-80ea-e543e1587fe3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hshnq" Apr 21 14:36:12.658841 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.658802 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hshnq" Apr 21 14:36:12.820987 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:12.820890 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hshnq"] Apr 21 14:36:12.825146 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:36:12.825102 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4e241db_df89_4d97_80ea_e543e1587fe3.slice/crio-e83981879987feb5358c5860bd0c49397a910ff3472a8bb7ff82e2b878c735af WatchSource:0}: Error finding container e83981879987feb5358c5860bd0c49397a910ff3472a8bb7ff82e2b878c735af: Status 404 returned error can't find the container with id e83981879987feb5358c5860bd0c49397a910ff3472a8bb7ff82e2b878c735af Apr 21 14:36:13.513272 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:13.513237 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hshnq" event={"ID":"c4e241db-df89-4d97-80ea-e543e1587fe3","Type":"ContainerStarted","Data":"e83981879987feb5358c5860bd0c49397a910ff3472a8bb7ff82e2b878c735af"} Apr 21 14:36:15.520732 
ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:15.520690 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hshnq" event={"ID":"c4e241db-df89-4d97-80ea-e543e1587fe3","Type":"ContainerStarted","Data":"4163d8af592793f419a06aebb779c1913582d56de85f995a1de9a6b0e4e8a40c"} Apr 21 14:36:15.543341 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:15.542649 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hshnq" podStartSLOduration=1.4033067830000001 podStartE2EDuration="3.542631727s" podCreationTimestamp="2026-04-21 14:36:12 +0000 UTC" firstStartedPulling="2026-04-21 14:36:12.827591674 +0000 UTC m=+637.493612297" lastFinishedPulling="2026-04-21 14:36:14.966916606 +0000 UTC m=+639.632937241" observedRunningTime="2026-04-21 14:36:15.540738076 +0000 UTC m=+640.206758718" watchObservedRunningTime="2026-04-21 14:36:15.542631727 +0000 UTC m=+640.208652371" Apr 21 14:36:16.816868 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:16.816837 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k"] Apr 21 14:36:16.847866 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:16.847818 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k"] Apr 21 14:36:16.848020 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:16.847991 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" Apr 21 14:36:16.853720 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:16.853690 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 14:36:16.853720 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:16.853689 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qj2gg\"" Apr 21 14:36:16.854744 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:16.854475 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 14:36:16.983748 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:16.983712 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4zw7\" (UniqueName: \"kubernetes.io/projected/d16b2a6d-a4f8-4150-b70e-a5e267999a46-kube-api-access-n4zw7\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k\" (UID: \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" Apr 21 14:36:16.983937 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:16.983754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d16b2a6d-a4f8-4150-b70e-a5e267999a46-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k\" (UID: \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" Apr 21 14:36:16.983937 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:16.983844 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d16b2a6d-a4f8-4150-b70e-a5e267999a46-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k\" (UID: \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" Apr 21 14:36:17.084862 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:17.084774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d16b2a6d-a4f8-4150-b70e-a5e267999a46-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k\" (UID: \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" Apr 21 14:36:17.084862 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:17.084830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4zw7\" (UniqueName: \"kubernetes.io/projected/d16b2a6d-a4f8-4150-b70e-a5e267999a46-kube-api-access-n4zw7\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k\" (UID: \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" Apr 21 14:36:17.084862 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:17.084851 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d16b2a6d-a4f8-4150-b70e-a5e267999a46-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k\" (UID: \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" Apr 21 14:36:17.085220 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:17.085199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d16b2a6d-a4f8-4150-b70e-a5e267999a46-util\") pod 
\"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k\" (UID: \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" Apr 21 14:36:17.085289 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:17.085235 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d16b2a6d-a4f8-4150-b70e-a5e267999a46-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k\" (UID: \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" Apr 21 14:36:17.106996 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:17.106960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4zw7\" (UniqueName: \"kubernetes.io/projected/d16b2a6d-a4f8-4150-b70e-a5e267999a46-kube-api-access-n4zw7\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k\" (UID: \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" Apr 21 14:36:17.157135 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:17.157074 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" Apr 21 14:36:17.296604 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:17.294832 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k"] Apr 21 14:36:17.299710 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:36:17.299674 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd16b2a6d_a4f8_4150_b70e_a5e267999a46.slice/crio-9f127374c183cad7b52f64f874c9355f3970446214f09e9aa28285b26ccca72e WatchSource:0}: Error finding container 9f127374c183cad7b52f64f874c9355f3970446214f09e9aa28285b26ccca72e: Status 404 returned error can't find the container with id 9f127374c183cad7b52f64f874c9355f3970446214f09e9aa28285b26ccca72e Apr 21 14:36:17.528436 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:17.528403 2576 generic.go:358] "Generic (PLEG): container finished" podID="d16b2a6d-a4f8-4150-b70e-a5e267999a46" containerID="778de2937f8564f4f3fa2edf1e835284df0e2d9784c3c2ee42b13ac5bab39602" exitCode=0 Apr 21 14:36:17.528579 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:17.528488 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" event={"ID":"d16b2a6d-a4f8-4150-b70e-a5e267999a46","Type":"ContainerDied","Data":"778de2937f8564f4f3fa2edf1e835284df0e2d9784c3c2ee42b13ac5bab39602"} Apr 21 14:36:17.528579 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:17.528523 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" event={"ID":"d16b2a6d-a4f8-4150-b70e-a5e267999a46","Type":"ContainerStarted","Data":"9f127374c183cad7b52f64f874c9355f3970446214f09e9aa28285b26ccca72e"} Apr 21 14:36:20.540163 ip-10-0-138-110 kubenswrapper[2576]: 
I0421 14:36:20.540132 2576 generic.go:358] "Generic (PLEG): container finished" podID="d16b2a6d-a4f8-4150-b70e-a5e267999a46" containerID="359ca2b1493ffe2a6c2c04c5e111b18892aba20bd17bebc1d28ec3ebc8dd8dc2" exitCode=0 Apr 21 14:36:20.540553 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:20.540223 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" event={"ID":"d16b2a6d-a4f8-4150-b70e-a5e267999a46","Type":"ContainerDied","Data":"359ca2b1493ffe2a6c2c04c5e111b18892aba20bd17bebc1d28ec3ebc8dd8dc2"} Apr 21 14:36:21.544889 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:21.544855 2576 generic.go:358] "Generic (PLEG): container finished" podID="d16b2a6d-a4f8-4150-b70e-a5e267999a46" containerID="f571571f903ddb3d1810921a447d9779c0371bdafbb8eaf7c9969aab50a91246" exitCode=0 Apr 21 14:36:21.545248 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:21.544917 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" event={"ID":"d16b2a6d-a4f8-4150-b70e-a5e267999a46","Type":"ContainerDied","Data":"f571571f903ddb3d1810921a447d9779c0371bdafbb8eaf7c9969aab50a91246"} Apr 21 14:36:22.662992 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:22.662970 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" Apr 21 14:36:22.727313 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:22.727280 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d16b2a6d-a4f8-4150-b70e-a5e267999a46-util\") pod \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\" (UID: \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\") " Apr 21 14:36:22.727495 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:22.727324 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4zw7\" (UniqueName: \"kubernetes.io/projected/d16b2a6d-a4f8-4150-b70e-a5e267999a46-kube-api-access-n4zw7\") pod \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\" (UID: \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\") " Apr 21 14:36:22.727495 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:22.727362 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d16b2a6d-a4f8-4150-b70e-a5e267999a46-bundle\") pod \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\" (UID: \"d16b2a6d-a4f8-4150-b70e-a5e267999a46\") " Apr 21 14:36:22.727755 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:22.727727 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d16b2a6d-a4f8-4150-b70e-a5e267999a46-bundle" (OuterVolumeSpecName: "bundle") pod "d16b2a6d-a4f8-4150-b70e-a5e267999a46" (UID: "d16b2a6d-a4f8-4150-b70e-a5e267999a46"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:36:22.729414 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:22.729392 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d16b2a6d-a4f8-4150-b70e-a5e267999a46-kube-api-access-n4zw7" (OuterVolumeSpecName: "kube-api-access-n4zw7") pod "d16b2a6d-a4f8-4150-b70e-a5e267999a46" (UID: "d16b2a6d-a4f8-4150-b70e-a5e267999a46"). InnerVolumeSpecName "kube-api-access-n4zw7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:36:22.735654 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:22.735618 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d16b2a6d-a4f8-4150-b70e-a5e267999a46-util" (OuterVolumeSpecName: "util") pod "d16b2a6d-a4f8-4150-b70e-a5e267999a46" (UID: "d16b2a6d-a4f8-4150-b70e-a5e267999a46"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:36:22.828743 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:22.828661 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n4zw7\" (UniqueName: \"kubernetes.io/projected/d16b2a6d-a4f8-4150-b70e-a5e267999a46-kube-api-access-n4zw7\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:36:22.828743 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:22.828693 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d16b2a6d-a4f8-4150-b70e-a5e267999a46-bundle\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:36:22.828743 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:22.828702 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d16b2a6d-a4f8-4150-b70e-a5e267999a46-util\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:36:23.553593 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:23.553555 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" event={"ID":"d16b2a6d-a4f8-4150-b70e-a5e267999a46","Type":"ContainerDied","Data":"9f127374c183cad7b52f64f874c9355f3970446214f09e9aa28285b26ccca72e"} Apr 21 14:36:23.553593 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:23.553586 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pk8k" Apr 21 14:36:23.553794 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:23.553589 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f127374c183cad7b52f64f874c9355f3970446214f09e9aa28285b26ccca72e" Apr 21 14:36:29.545559 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.545526 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-9gxr7"] Apr 21 14:36:29.545923 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.545770 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d16b2a6d-a4f8-4150-b70e-a5e267999a46" containerName="extract" Apr 21 14:36:29.545923 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.545780 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16b2a6d-a4f8-4150-b70e-a5e267999a46" containerName="extract" Apr 21 14:36:29.545923 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.545801 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d16b2a6d-a4f8-4150-b70e-a5e267999a46" containerName="util" Apr 21 14:36:29.545923 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.545807 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16b2a6d-a4f8-4150-b70e-a5e267999a46" containerName="util" Apr 21 14:36:29.545923 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.545813 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="d16b2a6d-a4f8-4150-b70e-a5e267999a46" containerName="pull" Apr 21 14:36:29.545923 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.545819 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16b2a6d-a4f8-4150-b70e-a5e267999a46" containerName="pull" Apr 21 14:36:29.545923 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.545858 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d16b2a6d-a4f8-4150-b70e-a5e267999a46" containerName="extract" Apr 21 14:36:29.548671 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.548645 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9gxr7" Apr 21 14:36:29.553745 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.553721 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 14:36:29.556832 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.556812 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-x45gs\"" Apr 21 14:36:29.557331 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.557318 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:36:29.566427 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.566400 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-9gxr7"] Apr 21 14:36:29.681136 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.681075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xxn5\" (UniqueName: \"kubernetes.io/projected/39c76ed2-fd0b-4900-b8cd-f3c36d32de46-kube-api-access-4xxn5\") pod \"openshift-lws-operator-bfc7f696d-9gxr7\" (UID: \"39c76ed2-fd0b-4900-b8cd-f3c36d32de46\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9gxr7" Apr 21 14:36:29.681136 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.681133 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/39c76ed2-fd0b-4900-b8cd-f3c36d32de46-tmp\") pod \"openshift-lws-operator-bfc7f696d-9gxr7\" (UID: \"39c76ed2-fd0b-4900-b8cd-f3c36d32de46\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9gxr7" Apr 21 14:36:29.782399 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.782362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xxn5\" (UniqueName: \"kubernetes.io/projected/39c76ed2-fd0b-4900-b8cd-f3c36d32de46-kube-api-access-4xxn5\") pod \"openshift-lws-operator-bfc7f696d-9gxr7\" (UID: \"39c76ed2-fd0b-4900-b8cd-f3c36d32de46\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9gxr7" Apr 21 14:36:29.782558 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.782406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/39c76ed2-fd0b-4900-b8cd-f3c36d32de46-tmp\") pod \"openshift-lws-operator-bfc7f696d-9gxr7\" (UID: \"39c76ed2-fd0b-4900-b8cd-f3c36d32de46\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9gxr7" Apr 21 14:36:29.782806 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.782785 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/39c76ed2-fd0b-4900-b8cd-f3c36d32de46-tmp\") pod \"openshift-lws-operator-bfc7f696d-9gxr7\" (UID: \"39c76ed2-fd0b-4900-b8cd-f3c36d32de46\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9gxr7" Apr 21 14:36:29.795557 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.795529 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xxn5\" (UniqueName: 
\"kubernetes.io/projected/39c76ed2-fd0b-4900-b8cd-f3c36d32de46-kube-api-access-4xxn5\") pod \"openshift-lws-operator-bfc7f696d-9gxr7\" (UID: \"39c76ed2-fd0b-4900-b8cd-f3c36d32de46\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9gxr7" Apr 21 14:36:29.858030 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.857937 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9gxr7" Apr 21 14:36:29.981050 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:29.981018 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-9gxr7"] Apr 21 14:36:29.985028 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:36:29.984995 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39c76ed2_fd0b_4900_b8cd_f3c36d32de46.slice/crio-1a63090b08ab4645fb2f06e01637914a42376ff27ec1caec1d810afc2069dfd3 WatchSource:0}: Error finding container 1a63090b08ab4645fb2f06e01637914a42376ff27ec1caec1d810afc2069dfd3: Status 404 returned error can't find the container with id 1a63090b08ab4645fb2f06e01637914a42376ff27ec1caec1d810afc2069dfd3 Apr 21 14:36:30.575186 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:30.575145 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9gxr7" event={"ID":"39c76ed2-fd0b-4900-b8cd-f3c36d32de46","Type":"ContainerStarted","Data":"1a63090b08ab4645fb2f06e01637914a42376ff27ec1caec1d810afc2069dfd3"} Apr 21 14:36:35.591681 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:35.591648 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9gxr7" event={"ID":"39c76ed2-fd0b-4900-b8cd-f3c36d32de46","Type":"ContainerStarted","Data":"df7e1735fec29570f676e7f69804f80bca3dcb8643b4da2cd997abfe4b53a4f8"} Apr 21 14:36:35.607166 ip-10-0-138-110 
kubenswrapper[2576]: I0421 14:36:35.606992 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9gxr7" podStartSLOduration=2.032338971 podStartE2EDuration="6.606975511s" podCreationTimestamp="2026-04-21 14:36:29 +0000 UTC" firstStartedPulling="2026-04-21 14:36:29.986920541 +0000 UTC m=+654.652941163" lastFinishedPulling="2026-04-21 14:36:34.561557071 +0000 UTC m=+659.227577703" observedRunningTime="2026-04-21 14:36:35.606582902 +0000 UTC m=+660.272603639" watchObservedRunningTime="2026-04-21 14:36:35.606975511 +0000 UTC m=+660.272996153" Apr 21 14:36:36.427106 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:36.427067 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-vss9l"] Apr 21 14:36:36.430423 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:36.430399 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-vss9l" Apr 21 14:36:36.432706 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:36.432677 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 14:36:36.432887 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:36.432866 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 14:36:36.433512 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:36.433496 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-w2n29\"" Apr 21 14:36:36.437434 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:36.437406 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-vss9l"] Apr 21 14:36:36.533998 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:36.533953 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-v22xd\" (UniqueName: \"kubernetes.io/projected/00bc0e9f-ec49-4d27-8240-4a984a49ab6b-kube-api-access-v22xd\") pod \"cert-manager-79c8d999ff-vss9l\" (UID: \"00bc0e9f-ec49-4d27-8240-4a984a49ab6b\") " pod="cert-manager/cert-manager-79c8d999ff-vss9l" Apr 21 14:36:36.533998 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:36.534003 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00bc0e9f-ec49-4d27-8240-4a984a49ab6b-bound-sa-token\") pod \"cert-manager-79c8d999ff-vss9l\" (UID: \"00bc0e9f-ec49-4d27-8240-4a984a49ab6b\") " pod="cert-manager/cert-manager-79c8d999ff-vss9l" Apr 21 14:36:36.635186 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:36.635149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00bc0e9f-ec49-4d27-8240-4a984a49ab6b-bound-sa-token\") pod \"cert-manager-79c8d999ff-vss9l\" (UID: \"00bc0e9f-ec49-4d27-8240-4a984a49ab6b\") " pod="cert-manager/cert-manager-79c8d999ff-vss9l" Apr 21 14:36:36.635581 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:36.635218 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v22xd\" (UniqueName: \"kubernetes.io/projected/00bc0e9f-ec49-4d27-8240-4a984a49ab6b-kube-api-access-v22xd\") pod \"cert-manager-79c8d999ff-vss9l\" (UID: \"00bc0e9f-ec49-4d27-8240-4a984a49ab6b\") " pod="cert-manager/cert-manager-79c8d999ff-vss9l" Apr 21 14:36:36.645856 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:36.645826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v22xd\" (UniqueName: \"kubernetes.io/projected/00bc0e9f-ec49-4d27-8240-4a984a49ab6b-kube-api-access-v22xd\") pod \"cert-manager-79c8d999ff-vss9l\" (UID: \"00bc0e9f-ec49-4d27-8240-4a984a49ab6b\") " pod="cert-manager/cert-manager-79c8d999ff-vss9l" Apr 21 14:36:36.646328 
ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:36.646311 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00bc0e9f-ec49-4d27-8240-4a984a49ab6b-bound-sa-token\") pod \"cert-manager-79c8d999ff-vss9l\" (UID: \"00bc0e9f-ec49-4d27-8240-4a984a49ab6b\") " pod="cert-manager/cert-manager-79c8d999ff-vss9l" Apr 21 14:36:36.741033 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:36.740939 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-vss9l" Apr 21 14:36:36.860248 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:36.860221 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-vss9l"] Apr 21 14:36:36.862751 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:36:36.862724 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00bc0e9f_ec49_4d27_8240_4a984a49ab6b.slice/crio-9b4d3593ad2fc0543ee1eb76aaba25ee3e2cce82d976c6656307a7299ef82363 WatchSource:0}: Error finding container 9b4d3593ad2fc0543ee1eb76aaba25ee3e2cce82d976c6656307a7299ef82363: Status 404 returned error can't find the container with id 9b4d3593ad2fc0543ee1eb76aaba25ee3e2cce82d976c6656307a7299ef82363 Apr 21 14:36:37.599062 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:37.599027 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-vss9l" event={"ID":"00bc0e9f-ec49-4d27-8240-4a984a49ab6b","Type":"ContainerStarted","Data":"9b4d3593ad2fc0543ee1eb76aaba25ee3e2cce82d976c6656307a7299ef82363"} Apr 21 14:36:39.675784 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.675748 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq"] Apr 21 14:36:39.679335 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.679311 2576 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" Apr 21 14:36:39.681978 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.681955 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 14:36:39.682747 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.682728 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qj2gg\"" Apr 21 14:36:39.682848 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.682758 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 14:36:39.688919 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.688840 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq"] Apr 21 14:36:39.763152 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.763094 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/412f804d-121a-4b2a-bb5c-e6fc3165c46a-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq\" (UID: \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" Apr 21 14:36:39.763346 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.763242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/412f804d-121a-4b2a-bb5c-e6fc3165c46a-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq\" (UID: \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" Apr 21 14:36:39.763346 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.763300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkjr4\" (UniqueName: \"kubernetes.io/projected/412f804d-121a-4b2a-bb5c-e6fc3165c46a-kube-api-access-pkjr4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq\" (UID: \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" Apr 21 14:36:39.864416 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.864374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkjr4\" (UniqueName: \"kubernetes.io/projected/412f804d-121a-4b2a-bb5c-e6fc3165c46a-kube-api-access-pkjr4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq\" (UID: \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" Apr 21 14:36:39.864584 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.864469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/412f804d-121a-4b2a-bb5c-e6fc3165c46a-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq\" (UID: \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" Apr 21 14:36:39.864584 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.864523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/412f804d-121a-4b2a-bb5c-e6fc3165c46a-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq\" (UID: \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" Apr 21 14:36:39.864939 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.864913 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/412f804d-121a-4b2a-bb5c-e6fc3165c46a-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq\" (UID: \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" Apr 21 14:36:39.865029 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.864945 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/412f804d-121a-4b2a-bb5c-e6fc3165c46a-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq\" (UID: \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" Apr 21 14:36:39.873680 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.873650 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkjr4\" (UniqueName: \"kubernetes.io/projected/412f804d-121a-4b2a-bb5c-e6fc3165c46a-kube-api-access-pkjr4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq\" (UID: \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" Apr 21 14:36:39.991384 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:39.991302 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" Apr 21 14:36:40.292378 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:40.292343 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq"] Apr 21 14:36:40.295267 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:36:40.295224 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod412f804d_121a_4b2a_bb5c_e6fc3165c46a.slice/crio-d843b902b06485f3140dddecee407d16d33e4b14b78919f840d92003a0af3833 WatchSource:0}: Error finding container d843b902b06485f3140dddecee407d16d33e4b14b78919f840d92003a0af3833: Status 404 returned error can't find the container with id d843b902b06485f3140dddecee407d16d33e4b14b78919f840d92003a0af3833 Apr 21 14:36:40.616620 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:40.616538 2576 generic.go:358] "Generic (PLEG): container finished" podID="412f804d-121a-4b2a-bb5c-e6fc3165c46a" containerID="8e8f6aefdb45355158129e5b51865c149e62421fda4031ff220b9905b5fec5e3" exitCode=0 Apr 21 14:36:40.616779 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:40.616622 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" event={"ID":"412f804d-121a-4b2a-bb5c-e6fc3165c46a","Type":"ContainerDied","Data":"8e8f6aefdb45355158129e5b51865c149e62421fda4031ff220b9905b5fec5e3"} Apr 21 14:36:40.616779 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:40.616661 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" event={"ID":"412f804d-121a-4b2a-bb5c-e6fc3165c46a","Type":"ContainerStarted","Data":"d843b902b06485f3140dddecee407d16d33e4b14b78919f840d92003a0af3833"} Apr 21 14:36:40.617991 ip-10-0-138-110 kubenswrapper[2576]: 
I0421 14:36:40.617968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-vss9l" event={"ID":"00bc0e9f-ec49-4d27-8240-4a984a49ab6b","Type":"ContainerStarted","Data":"abf0a3dedb0b6c2957582f62225549b533337c8da8b761d59b70d761f4bbfcdb"} Apr 21 14:36:40.655048 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:40.654994 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-vss9l" podStartSLOduration=1.284369621 podStartE2EDuration="4.654980436s" podCreationTimestamp="2026-04-21 14:36:36 +0000 UTC" firstStartedPulling="2026-04-21 14:36:36.864938159 +0000 UTC m=+661.530958781" lastFinishedPulling="2026-04-21 14:36:40.235548962 +0000 UTC m=+664.901569596" observedRunningTime="2026-04-21 14:36:40.653696533 +0000 UTC m=+665.319717197" watchObservedRunningTime="2026-04-21 14:36:40.654980436 +0000 UTC m=+665.321001077" Apr 21 14:36:41.624413 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:41.624333 2576 generic.go:358] "Generic (PLEG): container finished" podID="412f804d-121a-4b2a-bb5c-e6fc3165c46a" containerID="16d80062967cece3fb97af5c5d280e663a3fc783101f783f889bf771e36cdac3" exitCode=0 Apr 21 14:36:41.624413 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:41.624398 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" event={"ID":"412f804d-121a-4b2a-bb5c-e6fc3165c46a","Type":"ContainerDied","Data":"16d80062967cece3fb97af5c5d280e663a3fc783101f783f889bf771e36cdac3"} Apr 21 14:36:42.628976 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:42.628944 2576 generic.go:358] "Generic (PLEG): container finished" podID="412f804d-121a-4b2a-bb5c-e6fc3165c46a" containerID="39ff53a1af0157e4fa443eea2a5f9969eecbebe0b6f2adc7907b10dec503b35b" exitCode=0 Apr 21 14:36:42.629435 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:42.629024 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" event={"ID":"412f804d-121a-4b2a-bb5c-e6fc3165c46a","Type":"ContainerDied","Data":"39ff53a1af0157e4fa443eea2a5f9969eecbebe0b6f2adc7907b10dec503b35b"} Apr 21 14:36:43.750658 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:43.750635 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" Apr 21 14:36:43.896150 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:43.896098 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/412f804d-121a-4b2a-bb5c-e6fc3165c46a-bundle\") pod \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\" (UID: \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\") " Apr 21 14:36:43.896308 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:43.896158 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkjr4\" (UniqueName: \"kubernetes.io/projected/412f804d-121a-4b2a-bb5c-e6fc3165c46a-kube-api-access-pkjr4\") pod \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\" (UID: \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\") " Apr 21 14:36:43.896308 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:43.896217 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/412f804d-121a-4b2a-bb5c-e6fc3165c46a-util\") pod \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\" (UID: \"412f804d-121a-4b2a-bb5c-e6fc3165c46a\") " Apr 21 14:36:43.896787 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:43.896761 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/412f804d-121a-4b2a-bb5c-e6fc3165c46a-bundle" (OuterVolumeSpecName: "bundle") pod "412f804d-121a-4b2a-bb5c-e6fc3165c46a" (UID: "412f804d-121a-4b2a-bb5c-e6fc3165c46a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:36:43.898246 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:43.898220 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412f804d-121a-4b2a-bb5c-e6fc3165c46a-kube-api-access-pkjr4" (OuterVolumeSpecName: "kube-api-access-pkjr4") pod "412f804d-121a-4b2a-bb5c-e6fc3165c46a" (UID: "412f804d-121a-4b2a-bb5c-e6fc3165c46a"). InnerVolumeSpecName "kube-api-access-pkjr4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:36:43.901598 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:43.901562 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/412f804d-121a-4b2a-bb5c-e6fc3165c46a-util" (OuterVolumeSpecName: "util") pod "412f804d-121a-4b2a-bb5c-e6fc3165c46a" (UID: "412f804d-121a-4b2a-bb5c-e6fc3165c46a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:36:43.996687 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:43.996656 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/412f804d-121a-4b2a-bb5c-e6fc3165c46a-util\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:36:43.996687 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:43.996683 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/412f804d-121a-4b2a-bb5c-e6fc3165c46a-bundle\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:36:43.996687 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:43.996693 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pkjr4\" (UniqueName: \"kubernetes.io/projected/412f804d-121a-4b2a-bb5c-e6fc3165c46a-kube-api-access-pkjr4\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:36:44.636462 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:44.636420 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" event={"ID":"412f804d-121a-4b2a-bb5c-e6fc3165c46a","Type":"ContainerDied","Data":"d843b902b06485f3140dddecee407d16d33e4b14b78919f840d92003a0af3833"} Apr 21 14:36:44.636462 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:44.636441 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xprpq" Apr 21 14:36:44.636462 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:44.636457 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d843b902b06485f3140dddecee407d16d33e4b14b78919f840d92003a0af3833" Apr 21 14:36:48.882100 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:48.882034 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr"] Apr 21 14:36:48.883018 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:48.882461 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="412f804d-121a-4b2a-bb5c-e6fc3165c46a" containerName="util" Apr 21 14:36:48.883018 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:48.882481 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="412f804d-121a-4b2a-bb5c-e6fc3165c46a" containerName="util" Apr 21 14:36:48.883018 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:48.882493 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="412f804d-121a-4b2a-bb5c-e6fc3165c46a" containerName="extract" Apr 21 14:36:48.883018 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:48.882512 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="412f804d-121a-4b2a-bb5c-e6fc3165c46a" containerName="extract" Apr 21 14:36:48.883018 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:48.882542 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="412f804d-121a-4b2a-bb5c-e6fc3165c46a" containerName="pull" Apr 21 14:36:48.883018 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:48.882551 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="412f804d-121a-4b2a-bb5c-e6fc3165c46a" containerName="pull" Apr 21 14:36:48.883018 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:48.882626 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="412f804d-121a-4b2a-bb5c-e6fc3165c46a" containerName="extract" Apr 21 14:36:48.889236 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:48.889192 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" Apr 21 14:36:48.892982 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:48.891701 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qj2gg\"" Apr 21 14:36:48.892982 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:48.892066 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 14:36:48.892982 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:48.892620 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 14:36:48.901076 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:48.901046 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr"] Apr 21 14:36:49.035287 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:49.035249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr\" (UID: 
\"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" Apr 21 14:36:49.035468 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:49.035301 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr\" (UID: \"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" Apr 21 14:36:49.035468 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:49.035338 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvbp9\" (UniqueName: \"kubernetes.io/projected/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-kube-api-access-cvbp9\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr\" (UID: \"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" Apr 21 14:36:49.136326 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:49.136227 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr\" (UID: \"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" Apr 21 14:36:49.136326 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:49.136317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvbp9\" (UniqueName: \"kubernetes.io/projected/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-kube-api-access-cvbp9\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr\" (UID: 
\"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" Apr 21 14:36:49.136507 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:49.136355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr\" (UID: \"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" Apr 21 14:36:49.136667 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:49.136647 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr\" (UID: \"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" Apr 21 14:36:49.136702 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:49.136649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr\" (UID: \"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" Apr 21 14:36:49.147865 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:49.147833 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvbp9\" (UniqueName: \"kubernetes.io/projected/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-kube-api-access-cvbp9\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr\" (UID: \"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" Apr 21 14:36:49.202511 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:49.202479 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" Apr 21 14:36:49.352679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:49.352655 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr"] Apr 21 14:36:49.355002 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:36:49.354971 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b4afdf4_1d0e_48fb_b086_9dd2334293bf.slice/crio-558ba8d7ed4896a05260489f555db6662148e3c2b65a099140647891057a932f WatchSource:0}: Error finding container 558ba8d7ed4896a05260489f555db6662148e3c2b65a099140647891057a932f: Status 404 returned error can't find the container with id 558ba8d7ed4896a05260489f555db6662148e3c2b65a099140647891057a932f Apr 21 14:36:49.653633 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:49.653596 2576 generic.go:358] "Generic (PLEG): container finished" podID="5b4afdf4-1d0e-48fb-b086-9dd2334293bf" containerID="6f3ab081aef5b2078b4c6fb20087c130790b9c7e6330f1169acf6d9d43bbb9f3" exitCode=0 Apr 21 14:36:49.653814 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:49.653685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" event={"ID":"5b4afdf4-1d0e-48fb-b086-9dd2334293bf","Type":"ContainerDied","Data":"6f3ab081aef5b2078b4c6fb20087c130790b9c7e6330f1169acf6d9d43bbb9f3"} Apr 21 14:36:49.653814 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:49.653722 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" event={"ID":"5b4afdf4-1d0e-48fb-b086-9dd2334293bf","Type":"ContainerStarted","Data":"558ba8d7ed4896a05260489f555db6662148e3c2b65a099140647891057a932f"} Apr 21 14:36:50.658557 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.658526 2576 generic.go:358] "Generic (PLEG): container finished" podID="5b4afdf4-1d0e-48fb-b086-9dd2334293bf" containerID="4fa4e9b05c44827e27ad36549aa6e45e0565952dadb48f809ea2db3938e03c76" exitCode=0 Apr 21 14:36:50.658938 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.658586 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" event={"ID":"5b4afdf4-1d0e-48fb-b086-9dd2334293bf","Type":"ContainerDied","Data":"4fa4e9b05c44827e27ad36549aa6e45e0565952dadb48f809ea2db3938e03c76"} Apr 21 14:36:50.747964 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.747929 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd"] Apr 21 14:36:50.754398 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.754371 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd" Apr 21 14:36:50.757552 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.757523 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 14:36:50.757685 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.757598 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 14:36:50.757685 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.757605 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 14:36:50.757685 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.757647 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-dt2r5\"" Apr 21 14:36:50.757878 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.757860 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 14:36:50.772567 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.772542 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd"] Apr 21 14:36:50.850594 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.850567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d06ff8e8-a678-4df5-a145-0b0b2b8a94f2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7df645bd74-qg9dd\" (UID: \"d06ff8e8-a678-4df5-a145-0b0b2b8a94f2\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd" Apr 21 14:36:50.850773 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.850607 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84xbd\" (UniqueName: \"kubernetes.io/projected/d06ff8e8-a678-4df5-a145-0b0b2b8a94f2-kube-api-access-84xbd\") pod \"opendatahub-operator-controller-manager-7df645bd74-qg9dd\" (UID: \"d06ff8e8-a678-4df5-a145-0b0b2b8a94f2\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd" Apr 21 14:36:50.850773 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.850733 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d06ff8e8-a678-4df5-a145-0b0b2b8a94f2-webhook-cert\") pod \"opendatahub-operator-controller-manager-7df645bd74-qg9dd\" (UID: \"d06ff8e8-a678-4df5-a145-0b0b2b8a94f2\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd" Apr 21 14:36:50.952007 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.951921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d06ff8e8-a678-4df5-a145-0b0b2b8a94f2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7df645bd74-qg9dd\" (UID: \"d06ff8e8-a678-4df5-a145-0b0b2b8a94f2\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd" Apr 21 14:36:50.952007 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.951979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84xbd\" (UniqueName: \"kubernetes.io/projected/d06ff8e8-a678-4df5-a145-0b0b2b8a94f2-kube-api-access-84xbd\") pod \"opendatahub-operator-controller-manager-7df645bd74-qg9dd\" (UID: \"d06ff8e8-a678-4df5-a145-0b0b2b8a94f2\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd" Apr 21 14:36:50.952198 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.952041 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d06ff8e8-a678-4df5-a145-0b0b2b8a94f2-webhook-cert\") pod \"opendatahub-operator-controller-manager-7df645bd74-qg9dd\" (UID: \"d06ff8e8-a678-4df5-a145-0b0b2b8a94f2\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd" Apr 21 14:36:50.954461 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.954432 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d06ff8e8-a678-4df5-a145-0b0b2b8a94f2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7df645bd74-qg9dd\" (UID: \"d06ff8e8-a678-4df5-a145-0b0b2b8a94f2\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd" Apr 21 14:36:50.954580 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.954444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d06ff8e8-a678-4df5-a145-0b0b2b8a94f2-webhook-cert\") pod \"opendatahub-operator-controller-manager-7df645bd74-qg9dd\" (UID: \"d06ff8e8-a678-4df5-a145-0b0b2b8a94f2\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd" Apr 21 14:36:50.967220 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:50.967201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84xbd\" (UniqueName: \"kubernetes.io/projected/d06ff8e8-a678-4df5-a145-0b0b2b8a94f2-kube-api-access-84xbd\") pod \"opendatahub-operator-controller-manager-7df645bd74-qg9dd\" (UID: \"d06ff8e8-a678-4df5-a145-0b0b2b8a94f2\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd" Apr 21 14:36:51.065404 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:51.065370 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd" Apr 21 14:36:51.200841 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:51.200812 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd"] Apr 21 14:36:51.202955 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:36:51.202907 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd06ff8e8_a678_4df5_a145_0b0b2b8a94f2.slice/crio-e4078776b7c06e34a773e0c63a8bab1386eef43424f77d678c56871560f25c96 WatchSource:0}: Error finding container e4078776b7c06e34a773e0c63a8bab1386eef43424f77d678c56871560f25c96: Status 404 returned error can't find the container with id e4078776b7c06e34a773e0c63a8bab1386eef43424f77d678c56871560f25c96 Apr 21 14:36:51.666530 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:51.666496 2576 generic.go:358] "Generic (PLEG): container finished" podID="5b4afdf4-1d0e-48fb-b086-9dd2334293bf" containerID="1633f6f6012100748c9daa0bbcd7532ffd2eddfaae936a5b33213ebe6df1ca56" exitCode=0 Apr 21 14:36:51.666949 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:51.666585 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" event={"ID":"5b4afdf4-1d0e-48fb-b086-9dd2334293bf","Type":"ContainerDied","Data":"1633f6f6012100748c9daa0bbcd7532ffd2eddfaae936a5b33213ebe6df1ca56"} Apr 21 14:36:51.667696 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:51.667673 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd" event={"ID":"d06ff8e8-a678-4df5-a145-0b0b2b8a94f2","Type":"ContainerStarted","Data":"e4078776b7c06e34a773e0c63a8bab1386eef43424f77d678c56871560f25c96"} Apr 21 14:36:53.792187 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:53.792160 2576 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" Apr 21 14:36:53.975938 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:53.975910 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-bundle\") pod \"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\" (UID: \"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\") " Apr 21 14:36:53.976131 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:53.975956 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvbp9\" (UniqueName: \"kubernetes.io/projected/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-kube-api-access-cvbp9\") pod \"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\" (UID: \"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\") " Apr 21 14:36:53.976131 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:53.976029 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-util\") pod \"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\" (UID: \"5b4afdf4-1d0e-48fb-b086-9dd2334293bf\") " Apr 21 14:36:53.976834 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:53.976799 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-bundle" (OuterVolumeSpecName: "bundle") pod "5b4afdf4-1d0e-48fb-b086-9dd2334293bf" (UID: "5b4afdf4-1d0e-48fb-b086-9dd2334293bf"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:36:53.978293 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:53.978267 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-kube-api-access-cvbp9" (OuterVolumeSpecName: "kube-api-access-cvbp9") pod "5b4afdf4-1d0e-48fb-b086-9dd2334293bf" (UID: "5b4afdf4-1d0e-48fb-b086-9dd2334293bf"). InnerVolumeSpecName "kube-api-access-cvbp9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:36:53.982927 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:53.982899 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-util" (OuterVolumeSpecName: "util") pod "5b4afdf4-1d0e-48fb-b086-9dd2334293bf" (UID: "5b4afdf4-1d0e-48fb-b086-9dd2334293bf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:36:54.076745 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:54.076663 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-util\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:36:54.076745 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:54.076692 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-bundle\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:36:54.076745 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:54.076706 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cvbp9\" (UniqueName: \"kubernetes.io/projected/5b4afdf4-1d0e-48fb-b086-9dd2334293bf-kube-api-access-cvbp9\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:36:54.678187 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:54.678145 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd" event={"ID":"d06ff8e8-a678-4df5-a145-0b0b2b8a94f2","Type":"ContainerStarted","Data":"8533038157e10a1af9ad15c9efe39cd8c9620bc98e281da95b461924bb39d1af"}
Apr 21 14:36:54.678377 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:54.678264 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd"
Apr 21 14:36:54.679545 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:54.679519 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr"
Apr 21 14:36:54.679662 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:54.679524 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6vkr" event={"ID":"5b4afdf4-1d0e-48fb-b086-9dd2334293bf","Type":"ContainerDied","Data":"558ba8d7ed4896a05260489f555db6662148e3c2b65a099140647891057a932f"}
Apr 21 14:36:54.679662 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:54.679620 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="558ba8d7ed4896a05260489f555db6662148e3c2b65a099140647891057a932f"
Apr 21 14:36:54.715052 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:36:54.714983 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd" podStartSLOduration=2.093546517 podStartE2EDuration="4.714965989s" podCreationTimestamp="2026-04-21 14:36:50 +0000 UTC" firstStartedPulling="2026-04-21 14:36:51.205294224 +0000 UTC m=+675.871314843" lastFinishedPulling="2026-04-21 14:36:53.826713685 +0000 UTC m=+678.492734315" observedRunningTime="2026-04-21 14:36:54.713001051 +0000 UTC m=+679.379021693" watchObservedRunningTime="2026-04-21 14:36:54.714965989 +0000 UTC m=+679.380986612"
Apr 21 14:37:05.685130 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:05.685091 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-qg9dd"
Apr 21 14:37:08.168593 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.168557 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"]
Apr 21 14:37:08.168966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.168842 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b4afdf4-1d0e-48fb-b086-9dd2334293bf" containerName="util"
Apr 21 14:37:08.168966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.168854 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b4afdf4-1d0e-48fb-b086-9dd2334293bf" containerName="util"
Apr 21 14:37:08.168966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.168866 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b4afdf4-1d0e-48fb-b086-9dd2334293bf" containerName="pull"
Apr 21 14:37:08.168966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.168871 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b4afdf4-1d0e-48fb-b086-9dd2334293bf" containerName="pull"
Apr 21 14:37:08.168966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.168879 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b4afdf4-1d0e-48fb-b086-9dd2334293bf" containerName="extract"
Apr 21 14:37:08.168966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.168887 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b4afdf4-1d0e-48fb-b086-9dd2334293bf" containerName="extract"
Apr 21 14:37:08.168966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.168935 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b4afdf4-1d0e-48fb-b086-9dd2334293bf" containerName="extract"
Apr 21 14:37:08.173369 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.173349 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"
Apr 21 14:37:08.176142 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.176105 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 14:37:08.176755 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.176741 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 14:37:08.177202 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.177186 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qj2gg\""
Apr 21 14:37:08.200992 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.200951 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"]
Apr 21 14:37:08.282243 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.282210 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/923c95f6-8aa4-4302-9689-f0456c59e91d-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr\" (UID: \"923c95f6-8aa4-4302-9689-f0456c59e91d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"
Apr 21 14:37:08.282420 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.282258 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/923c95f6-8aa4-4302-9689-f0456c59e91d-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr\" (UID: \"923c95f6-8aa4-4302-9689-f0456c59e91d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"
Apr 21 14:37:08.282420 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.282320 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrbk\" (UniqueName: \"kubernetes.io/projected/923c95f6-8aa4-4302-9689-f0456c59e91d-kube-api-access-rvrbk\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr\" (UID: \"923c95f6-8aa4-4302-9689-f0456c59e91d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"
Apr 21 14:37:08.382692 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.382649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/923c95f6-8aa4-4302-9689-f0456c59e91d-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr\" (UID: \"923c95f6-8aa4-4302-9689-f0456c59e91d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"
Apr 21 14:37:08.382692 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.382695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvrbk\" (UniqueName: \"kubernetes.io/projected/923c95f6-8aa4-4302-9689-f0456c59e91d-kube-api-access-rvrbk\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr\" (UID: \"923c95f6-8aa4-4302-9689-f0456c59e91d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"
Apr 21 14:37:08.382901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.382744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/923c95f6-8aa4-4302-9689-f0456c59e91d-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr\" (UID: \"923c95f6-8aa4-4302-9689-f0456c59e91d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"
Apr 21 14:37:08.383063 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.383042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/923c95f6-8aa4-4302-9689-f0456c59e91d-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr\" (UID: \"923c95f6-8aa4-4302-9689-f0456c59e91d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"
Apr 21 14:37:08.383134 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.383066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/923c95f6-8aa4-4302-9689-f0456c59e91d-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr\" (UID: \"923c95f6-8aa4-4302-9689-f0456c59e91d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"
Apr 21 14:37:08.392481 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.392451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrbk\" (UniqueName: \"kubernetes.io/projected/923c95f6-8aa4-4302-9689-f0456c59e91d-kube-api-access-rvrbk\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr\" (UID: \"923c95f6-8aa4-4302-9689-f0456c59e91d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"
Apr 21 14:37:08.486328 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.486243 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"
Apr 21 14:37:08.625559 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.625526 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"]
Apr 21 14:37:08.630106 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:37:08.630078 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod923c95f6_8aa4_4302_9689_f0456c59e91d.slice/crio-3548657db5737a080315c2b57d3f0c63352162e48d64d4d60172586acbaddb34 WatchSource:0}: Error finding container 3548657db5737a080315c2b57d3f0c63352162e48d64d4d60172586acbaddb34: Status 404 returned error can't find the container with id 3548657db5737a080315c2b57d3f0c63352162e48d64d4d60172586acbaddb34
Apr 21 14:37:08.726762 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.726722 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr" event={"ID":"923c95f6-8aa4-4302-9689-f0456c59e91d","Type":"ContainerStarted","Data":"02670ee849ff04eed9256c2f49cbbe4f140a365625517aacbaed5dcf865cdbed"}
Apr 21 14:37:08.726762 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:08.726759 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr" event={"ID":"923c95f6-8aa4-4302-9689-f0456c59e91d","Type":"ContainerStarted","Data":"3548657db5737a080315c2b57d3f0c63352162e48d64d4d60172586acbaddb34"}
Apr 21 14:37:09.476499 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.476465 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls"]
Apr 21 14:37:09.480323 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.480303 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls"
Apr 21 14:37:09.491190 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.491170 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 21 14:37:09.494999 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.494978 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 21 14:37:09.495061 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.494978 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 21 14:37:09.495191 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.495175 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-b2zpf\""
Apr 21 14:37:09.495419 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.495406 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 21 14:37:09.520947 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.520921 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls"]
Apr 21 14:37:09.589588 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.589552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66bb68fe-aa1c-46cd-b3cb-0afd155918f5-tls-certs\") pod \"kube-auth-proxy-7b77d4bb4f-5d8ls\" (UID: \"66bb68fe-aa1c-46cd-b3cb-0afd155918f5\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls"
Apr 21 14:37:09.589588 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.589587 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66bb68fe-aa1c-46cd-b3cb-0afd155918f5-tmp\") pod \"kube-auth-proxy-7b77d4bb4f-5d8ls\" (UID: \"66bb68fe-aa1c-46cd-b3cb-0afd155918f5\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls"
Apr 21 14:37:09.589758 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.589624 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvrkj\" (UniqueName: \"kubernetes.io/projected/66bb68fe-aa1c-46cd-b3cb-0afd155918f5-kube-api-access-tvrkj\") pod \"kube-auth-proxy-7b77d4bb4f-5d8ls\" (UID: \"66bb68fe-aa1c-46cd-b3cb-0afd155918f5\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls"
Apr 21 14:37:09.690351 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.690254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66bb68fe-aa1c-46cd-b3cb-0afd155918f5-tls-certs\") pod \"kube-auth-proxy-7b77d4bb4f-5d8ls\" (UID: \"66bb68fe-aa1c-46cd-b3cb-0afd155918f5\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls"
Apr 21 14:37:09.690351 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.690296 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66bb68fe-aa1c-46cd-b3cb-0afd155918f5-tmp\") pod \"kube-auth-proxy-7b77d4bb4f-5d8ls\" (UID: \"66bb68fe-aa1c-46cd-b3cb-0afd155918f5\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls"
Apr 21 14:37:09.690351 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.690334 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvrkj\" (UniqueName: \"kubernetes.io/projected/66bb68fe-aa1c-46cd-b3cb-0afd155918f5-kube-api-access-tvrkj\") pod \"kube-auth-proxy-7b77d4bb4f-5d8ls\" (UID: \"66bb68fe-aa1c-46cd-b3cb-0afd155918f5\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls"
Apr 21 14:37:09.692681 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.692651 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66bb68fe-aa1c-46cd-b3cb-0afd155918f5-tmp\") pod \"kube-auth-proxy-7b77d4bb4f-5d8ls\" (UID: \"66bb68fe-aa1c-46cd-b3cb-0afd155918f5\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls"
Apr 21 14:37:09.692882 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.692863 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66bb68fe-aa1c-46cd-b3cb-0afd155918f5-tls-certs\") pod \"kube-auth-proxy-7b77d4bb4f-5d8ls\" (UID: \"66bb68fe-aa1c-46cd-b3cb-0afd155918f5\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls"
Apr 21 14:37:09.704409 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.704385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvrkj\" (UniqueName: \"kubernetes.io/projected/66bb68fe-aa1c-46cd-b3cb-0afd155918f5-kube-api-access-tvrkj\") pod \"kube-auth-proxy-7b77d4bb4f-5d8ls\" (UID: \"66bb68fe-aa1c-46cd-b3cb-0afd155918f5\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls"
Apr 21 14:37:09.731472 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.731441 2576 generic.go:358] "Generic (PLEG): container finished" podID="923c95f6-8aa4-4302-9689-f0456c59e91d" containerID="02670ee849ff04eed9256c2f49cbbe4f140a365625517aacbaed5dcf865cdbed" exitCode=0
Apr 21 14:37:09.731637 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.731531 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr" event={"ID":"923c95f6-8aa4-4302-9689-f0456c59e91d","Type":"ContainerDied","Data":"02670ee849ff04eed9256c2f49cbbe4f140a365625517aacbaed5dcf865cdbed"}
Apr 21 14:37:09.789220 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.789186 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls"
Apr 21 14:37:09.905452 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:09.905426 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls"]
Apr 21 14:37:09.908005 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:37:09.907957 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66bb68fe_aa1c_46cd_b3cb_0afd155918f5.slice/crio-61846f797ed2c007334e3b06561822d9e08c46c42c04643b8b4ad808739e5c19 WatchSource:0}: Error finding container 61846f797ed2c007334e3b06561822d9e08c46c42c04643b8b4ad808739e5c19: Status 404 returned error can't find the container with id 61846f797ed2c007334e3b06561822d9e08c46c42c04643b8b4ad808739e5c19
Apr 21 14:37:10.735995 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:10.735953 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls" event={"ID":"66bb68fe-aa1c-46cd-b3cb-0afd155918f5","Type":"ContainerStarted","Data":"61846f797ed2c007334e3b06561822d9e08c46c42c04643b8b4ad808739e5c19"}
Apr 21 14:37:11.741233 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:11.741195 2576 generic.go:358] "Generic (PLEG): container finished" podID="923c95f6-8aa4-4302-9689-f0456c59e91d" containerID="2f93d286dc714f7b6f956c7b932f96f63fd2869378dab1990145355c680192b6" exitCode=0
Apr 21 14:37:11.741631 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:11.741237 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr" event={"ID":"923c95f6-8aa4-4302-9689-f0456c59e91d","Type":"ContainerDied","Data":"2f93d286dc714f7b6f956c7b932f96f63fd2869378dab1990145355c680192b6"}
Apr 21 14:37:12.749423 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:12.749387 2576 generic.go:358] "Generic (PLEG): container finished" podID="923c95f6-8aa4-4302-9689-f0456c59e91d" containerID="4616996e29154f6a9b51c073d2c8dc33c1fc31f48a8bd6fa5874a74f25af613f" exitCode=0
Apr 21 14:37:12.749864 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:12.749468 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr" event={"ID":"923c95f6-8aa4-4302-9689-f0456c59e91d","Type":"ContainerDied","Data":"4616996e29154f6a9b51c073d2c8dc33c1fc31f48a8bd6fa5874a74f25af613f"}
Apr 21 14:37:13.208147 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:13.208097 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-c8mtf"]
Apr 21 14:37:13.211367 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:13.211351 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf"
Apr 21 14:37:13.220396 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:13.220377 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 21 14:37:13.220537 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:13.220433 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-s7h5l\""
Apr 21 14:37:13.240979 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:13.240953 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-c8mtf"]
Apr 21 14:37:13.320883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:13.320837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9307725b-46ef-4197-880a-edc49567bde2-cert\") pod \"odh-model-controller-858dbf95b8-c8mtf\" (UID: \"9307725b-46ef-4197-880a-edc49567bde2\") " pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf"
Apr 21 14:37:13.321041 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:13.320910 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87xjh\" (UniqueName: \"kubernetes.io/projected/9307725b-46ef-4197-880a-edc49567bde2-kube-api-access-87xjh\") pod \"odh-model-controller-858dbf95b8-c8mtf\" (UID: \"9307725b-46ef-4197-880a-edc49567bde2\") " pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf"
Apr 21 14:37:13.421345 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:13.421313 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87xjh\" (UniqueName: \"kubernetes.io/projected/9307725b-46ef-4197-880a-edc49567bde2-kube-api-access-87xjh\") pod \"odh-model-controller-858dbf95b8-c8mtf\" (UID: \"9307725b-46ef-4197-880a-edc49567bde2\") " pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf"
Apr 21 14:37:13.421535 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:13.421375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9307725b-46ef-4197-880a-edc49567bde2-cert\") pod \"odh-model-controller-858dbf95b8-c8mtf\" (UID: \"9307725b-46ef-4197-880a-edc49567bde2\") " pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf"
Apr 21 14:37:13.421535 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:37:13.421481 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 21 14:37:13.421645 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:37:13.421542 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9307725b-46ef-4197-880a-edc49567bde2-cert podName:9307725b-46ef-4197-880a-edc49567bde2 nodeName:}" failed. No retries permitted until 2026-04-21 14:37:13.921521787 +0000 UTC m=+698.587542407 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9307725b-46ef-4197-880a-edc49567bde2-cert") pod "odh-model-controller-858dbf95b8-c8mtf" (UID: "9307725b-46ef-4197-880a-edc49567bde2") : secret "odh-model-controller-webhook-cert" not found
Apr 21 14:37:13.429673 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:13.429646 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87xjh\" (UniqueName: \"kubernetes.io/projected/9307725b-46ef-4197-880a-edc49567bde2-kube-api-access-87xjh\") pod \"odh-model-controller-858dbf95b8-c8mtf\" (UID: \"9307725b-46ef-4197-880a-edc49567bde2\") " pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf"
Apr 21 14:37:13.925413 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:13.925375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9307725b-46ef-4197-880a-edc49567bde2-cert\") pod \"odh-model-controller-858dbf95b8-c8mtf\" (UID: \"9307725b-46ef-4197-880a-edc49567bde2\") " pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf"
Apr 21 14:37:13.925819 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:37:13.925507 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 21 14:37:13.925819 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:37:13.925588 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9307725b-46ef-4197-880a-edc49567bde2-cert podName:9307725b-46ef-4197-880a-edc49567bde2 nodeName:}" failed. No retries permitted until 2026-04-21 14:37:14.925565429 +0000 UTC m=+699.591586075 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9307725b-46ef-4197-880a-edc49567bde2-cert") pod "odh-model-controller-858dbf95b8-c8mtf" (UID: "9307725b-46ef-4197-880a-edc49567bde2") : secret "odh-model-controller-webhook-cert" not found
Apr 21 14:37:14.403333 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.403309 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"
Apr 21 14:37:14.530128 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.530071 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvrbk\" (UniqueName: \"kubernetes.io/projected/923c95f6-8aa4-4302-9689-f0456c59e91d-kube-api-access-rvrbk\") pod \"923c95f6-8aa4-4302-9689-f0456c59e91d\" (UID: \"923c95f6-8aa4-4302-9689-f0456c59e91d\") "
Apr 21 14:37:14.530276 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.530160 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/923c95f6-8aa4-4302-9689-f0456c59e91d-bundle\") pod \"923c95f6-8aa4-4302-9689-f0456c59e91d\" (UID: \"923c95f6-8aa4-4302-9689-f0456c59e91d\") "
Apr 21 14:37:14.530276 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.530265 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/923c95f6-8aa4-4302-9689-f0456c59e91d-util\") pod \"923c95f6-8aa4-4302-9689-f0456c59e91d\" (UID: \"923c95f6-8aa4-4302-9689-f0456c59e91d\") "
Apr 21 14:37:14.531008 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.530980 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/923c95f6-8aa4-4302-9689-f0456c59e91d-bundle" (OuterVolumeSpecName: "bundle") pod "923c95f6-8aa4-4302-9689-f0456c59e91d" (UID: "923c95f6-8aa4-4302-9689-f0456c59e91d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 14:37:14.532135 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.532095 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/923c95f6-8aa4-4302-9689-f0456c59e91d-kube-api-access-rvrbk" (OuterVolumeSpecName: "kube-api-access-rvrbk") pod "923c95f6-8aa4-4302-9689-f0456c59e91d" (UID: "923c95f6-8aa4-4302-9689-f0456c59e91d"). InnerVolumeSpecName "kube-api-access-rvrbk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 14:37:14.534805 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.534784 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/923c95f6-8aa4-4302-9689-f0456c59e91d-util" (OuterVolumeSpecName: "util") pod "923c95f6-8aa4-4302-9689-f0456c59e91d" (UID: "923c95f6-8aa4-4302-9689-f0456c59e91d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 14:37:14.631690 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.631639 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rvrbk\" (UniqueName: \"kubernetes.io/projected/923c95f6-8aa4-4302-9689-f0456c59e91d-kube-api-access-rvrbk\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\""
Apr 21 14:37:14.631690 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.631685 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/923c95f6-8aa4-4302-9689-f0456c59e91d-bundle\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\""
Apr 21 14:37:14.631690 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.631697 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/923c95f6-8aa4-4302-9689-f0456c59e91d-util\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\""
Apr 21 14:37:14.757760 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.757723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr" event={"ID":"923c95f6-8aa4-4302-9689-f0456c59e91d","Type":"ContainerDied","Data":"3548657db5737a080315c2b57d3f0c63352162e48d64d4d60172586acbaddb34"}
Apr 21 14:37:14.757760 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.757753 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t8cbr"
Apr 21 14:37:14.757990 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.757758 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3548657db5737a080315c2b57d3f0c63352162e48d64d4d60172586acbaddb34"
Apr 21 14:37:14.759156 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.759103 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls" event={"ID":"66bb68fe-aa1c-46cd-b3cb-0afd155918f5","Type":"ContainerStarted","Data":"8d28c37f23f42a57b8fa20de2eb6732f9b4c3bf61fb45a4c8bb91d9aac6ca5e8"}
Apr 21 14:37:14.789152 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.789087 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5d8ls" podStartSLOduration=1.254535843 podStartE2EDuration="5.789069807s" podCreationTimestamp="2026-04-21 14:37:09 +0000 UTC" firstStartedPulling="2026-04-21 14:37:09.909623396 +0000 UTC m=+694.575644015" lastFinishedPulling="2026-04-21 14:37:14.444157349 +0000 UTC m=+699.110177979" observedRunningTime="2026-04-21 14:37:14.787492557 +0000 UTC m=+699.453513198" watchObservedRunningTime="2026-04-21 14:37:14.789069807 +0000 UTC m=+699.455090463"
Apr 21 14:37:14.934584 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.934546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9307725b-46ef-4197-880a-edc49567bde2-cert\") pod \"odh-model-controller-858dbf95b8-c8mtf\" (UID: \"9307725b-46ef-4197-880a-edc49567bde2\") " pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf"
Apr 21 14:37:14.936992 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:14.936971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9307725b-46ef-4197-880a-edc49567bde2-cert\") pod \"odh-model-controller-858dbf95b8-c8mtf\" (UID: \"9307725b-46ef-4197-880a-edc49567bde2\") " pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf"
Apr 21 14:37:15.020990 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:15.020899 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf"
Apr 21 14:37:15.142960 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:15.142802 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-c8mtf"]
Apr 21 14:37:15.145495 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:37:15.145463 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9307725b_46ef_4197_880a_edc49567bde2.slice/crio-9cdd5d88fb463fe94c20b64c03e90382922a6f298d4be15eaa86fee97afa8594 WatchSource:0}: Error finding container 9cdd5d88fb463fe94c20b64c03e90382922a6f298d4be15eaa86fee97afa8594: Status 404 returned error can't find the container with id 9cdd5d88fb463fe94c20b64c03e90382922a6f298d4be15eaa86fee97afa8594
Apr 21 14:37:15.766552 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:15.766511 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf" event={"ID":"9307725b-46ef-4197-880a-edc49567bde2","Type":"ContainerStarted","Data":"9cdd5d88fb463fe94c20b64c03e90382922a6f298d4be15eaa86fee97afa8594"}
Apr 21 14:37:18.779695 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:18.779659 2576 generic.go:358] "Generic (PLEG): container finished" podID="9307725b-46ef-4197-880a-edc49567bde2" containerID="de57c0891e4de30038dc459282b4a2670646f55eeb8e036357be16c570923a05" exitCode=1
Apr 21 14:37:18.780071 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:18.779707 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf" event={"ID":"9307725b-46ef-4197-880a-edc49567bde2","Type":"ContainerDied","Data":"de57c0891e4de30038dc459282b4a2670646f55eeb8e036357be16c570923a05"}
Apr 21 14:37:18.780071 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:18.779922 2576 scope.go:117] "RemoveContainer" containerID="de57c0891e4de30038dc459282b4a2670646f55eeb8e036357be16c570923a05"
Apr 21 14:37:19.039210 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.039095 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-ckrj2"]
Apr 21 14:37:19.039913 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.039891 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="923c95f6-8aa4-4302-9689-f0456c59e91d" containerName="util"
Apr 21 14:37:19.040073 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.040059 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="923c95f6-8aa4-4302-9689-f0456c59e91d" containerName="util"
Apr 21 14:37:19.040210 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.040197 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="923c95f6-8aa4-4302-9689-f0456c59e91d" containerName="pull"
Apr 21 14:37:19.040331 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.040318 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="923c95f6-8aa4-4302-9689-f0456c59e91d" containerName="pull"
Apr 21 14:37:19.040432 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.040422 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="923c95f6-8aa4-4302-9689-f0456c59e91d" containerName="extract"
Apr 21 14:37:19.040514 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.040505 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="923c95f6-8aa4-4302-9689-f0456c59e91d" containerName="extract"
Apr 21 14:37:19.040708 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.040696 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="923c95f6-8aa4-4302-9689-f0456c59e91d" containerName="extract"
Apr 21 14:37:19.043853 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.043831 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-ckrj2"
Apr 21 14:37:19.047498 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.047477 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 21 14:37:19.047691 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.047672 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-b24wb\""
Apr 21 14:37:19.057546 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.057524 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-ckrj2"]
Apr 21 14:37:19.167892 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.167855 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc4e480a-1a65-41e7-85b5-31e172a4dbca-cert\") pod \"kserve-controller-manager-856948b99f-ckrj2\" (UID: \"bc4e480a-1a65-41e7-85b5-31e172a4dbca\") " pod="opendatahub/kserve-controller-manager-856948b99f-ckrj2"
Apr 21 14:37:19.168058 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.168001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47dpd\" (UniqueName: \"kubernetes.io/projected/bc4e480a-1a65-41e7-85b5-31e172a4dbca-kube-api-access-47dpd\") pod \"kserve-controller-manager-856948b99f-ckrj2\" (UID: \"bc4e480a-1a65-41e7-85b5-31e172a4dbca\") " pod="opendatahub/kserve-controller-manager-856948b99f-ckrj2"
Apr 21 14:37:19.269159 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.269129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc4e480a-1a65-41e7-85b5-31e172a4dbca-cert\") pod \"kserve-controller-manager-856948b99f-ckrj2\" (UID: \"bc4e480a-1a65-41e7-85b5-31e172a4dbca\") " pod="opendatahub/kserve-controller-manager-856948b99f-ckrj2"
Apr 21 14:37:19.269277 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.269260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47dpd\" (UniqueName: \"kubernetes.io/projected/bc4e480a-1a65-41e7-85b5-31e172a4dbca-kube-api-access-47dpd\") pod \"kserve-controller-manager-856948b99f-ckrj2\" (UID: \"bc4e480a-1a65-41e7-85b5-31e172a4dbca\") " pod="opendatahub/kserve-controller-manager-856948b99f-ckrj2"
Apr 21 14:37:19.271443 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.271419 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc4e480a-1a65-41e7-85b5-31e172a4dbca-cert\") pod \"kserve-controller-manager-856948b99f-ckrj2\" (UID: \"bc4e480a-1a65-41e7-85b5-31e172a4dbca\") " pod="opendatahub/kserve-controller-manager-856948b99f-ckrj2"
Apr 21 14:37:19.287332 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.287306 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47dpd\" (UniqueName: \"kubernetes.io/projected/bc4e480a-1a65-41e7-85b5-31e172a4dbca-kube-api-access-47dpd\") pod \"kserve-controller-manager-856948b99f-ckrj2\" (UID: \"bc4e480a-1a65-41e7-85b5-31e172a4dbca\") " pod="opendatahub/kserve-controller-manager-856948b99f-ckrj2"
Apr 21 14:37:19.354364
ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.354278 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-ckrj2" Apr 21 14:37:19.496042 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.496016 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-ckrj2"] Apr 21 14:37:19.497861 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:37:19.497829 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc4e480a_1a65_41e7_85b5_31e172a4dbca.slice/crio-9b9eb45640c06fd88a0d39447808616e4dbb4a4606f13b7e779c7758cedf443c WatchSource:0}: Error finding container 9b9eb45640c06fd88a0d39447808616e4dbb4a4606f13b7e779c7758cedf443c: Status 404 returned error can't find the container with id 9b9eb45640c06fd88a0d39447808616e4dbb4a4606f13b7e779c7758cedf443c Apr 21 14:37:19.783704 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.783669 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-ckrj2" event={"ID":"bc4e480a-1a65-41e7-85b5-31e172a4dbca","Type":"ContainerStarted","Data":"9b9eb45640c06fd88a0d39447808616e4dbb4a4606f13b7e779c7758cedf443c"} Apr 21 14:37:19.785103 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.785069 2576 generic.go:358] "Generic (PLEG): container finished" podID="9307725b-46ef-4197-880a-edc49567bde2" containerID="efa99762fb738ba9e23b80d7df6ac87f97774585b37135b8581ca9783710f417" exitCode=1 Apr 21 14:37:19.785262 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.785106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf" event={"ID":"9307725b-46ef-4197-880a-edc49567bde2","Type":"ContainerDied","Data":"efa99762fb738ba9e23b80d7df6ac87f97774585b37135b8581ca9783710f417"} Apr 21 14:37:19.785262 ip-10-0-138-110 kubenswrapper[2576]: I0421 
14:37:19.785171 2576 scope.go:117] "RemoveContainer" containerID="de57c0891e4de30038dc459282b4a2670646f55eeb8e036357be16c570923a05" Apr 21 14:37:19.785395 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:19.785378 2576 scope.go:117] "RemoveContainer" containerID="efa99762fb738ba9e23b80d7df6ac87f97774585b37135b8581ca9783710f417" Apr 21 14:37:19.785612 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:37:19.785591 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-c8mtf_opendatahub(9307725b-46ef-4197-880a-edc49567bde2)\"" pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf" podUID="9307725b-46ef-4197-880a-edc49567bde2" Apr 21 14:37:20.789874 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:20.789841 2576 scope.go:117] "RemoveContainer" containerID="efa99762fb738ba9e23b80d7df6ac87f97774585b37135b8581ca9783710f417" Apr 21 14:37:20.790301 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:37:20.790002 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-c8mtf_opendatahub(9307725b-46ef-4197-880a-edc49567bde2)\"" pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf" podUID="9307725b-46ef-4197-880a-edc49567bde2" Apr 21 14:37:22.797577 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:22.797538 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-ckrj2" event={"ID":"bc4e480a-1a65-41e7-85b5-31e172a4dbca","Type":"ContainerStarted","Data":"b94dc3d61fbe2b0da2cff22292dade60feac96c588ed142b12240aa64fe508f5"} Apr 21 14:37:22.797973 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:22.797648 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="opendatahub/kserve-controller-manager-856948b99f-ckrj2" Apr 21 14:37:22.919043 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:22.914824 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-ckrj2" podStartSLOduration=1.092070815 podStartE2EDuration="3.914804755s" podCreationTimestamp="2026-04-21 14:37:19 +0000 UTC" firstStartedPulling="2026-04-21 14:37:19.499200196 +0000 UTC m=+704.165220816" lastFinishedPulling="2026-04-21 14:37:22.321934131 +0000 UTC m=+706.987954756" observedRunningTime="2026-04-21 14:37:22.912632119 +0000 UTC m=+707.578652761" watchObservedRunningTime="2026-04-21 14:37:22.914804755 +0000 UTC m=+707.580825397" Apr 21 14:37:23.533237 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.533199 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb"] Apr 21 14:37:23.536702 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.536683 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" Apr 21 14:37:23.572886 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.572853 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 14:37:23.573051 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.572943 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qj2gg\"" Apr 21 14:37:23.574991 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.574965 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 14:37:23.598352 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.598324 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb"] Apr 21 14:37:23.608275 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.608241 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6278\" (UniqueName: \"kubernetes.io/projected/0f1fe0b2-90b7-4692-86eb-4ff302d25972-kube-api-access-b6278\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb\" (UID: \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" Apr 21 14:37:23.608418 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.608303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f1fe0b2-90b7-4692-86eb-4ff302d25972-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb\" (UID: \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" Apr 21 14:37:23.608418 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.608346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f1fe0b2-90b7-4692-86eb-4ff302d25972-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb\" (UID: \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" Apr 21 14:37:23.709486 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.709448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f1fe0b2-90b7-4692-86eb-4ff302d25972-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb\" (UID: \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" Apr 21 14:37:23.709683 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.709493 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f1fe0b2-90b7-4692-86eb-4ff302d25972-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb\" (UID: \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" Apr 21 14:37:23.709683 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.709568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6278\" (UniqueName: \"kubernetes.io/projected/0f1fe0b2-90b7-4692-86eb-4ff302d25972-kube-api-access-b6278\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb\" (UID: \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" Apr 21 14:37:23.709846 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.709824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f1fe0b2-90b7-4692-86eb-4ff302d25972-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb\" (UID: \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" Apr 21 14:37:23.709909 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.709847 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f1fe0b2-90b7-4692-86eb-4ff302d25972-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb\" (UID: \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" Apr 21 14:37:23.747429 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.747388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6278\" (UniqueName: \"kubernetes.io/projected/0f1fe0b2-90b7-4692-86eb-4ff302d25972-kube-api-access-b6278\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb\" (UID: \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" Apr 21 14:37:23.845922 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.845843 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" Apr 21 14:37:23.988099 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:23.988066 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb"] Apr 21 14:37:23.991308 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:37:23.991272 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f1fe0b2_90b7_4692_86eb_4ff302d25972.slice/crio-76ed8e556829f07d046b6dc85c1772f4f8bb6518a59b4b085b6e1f506a3fd601 WatchSource:0}: Error finding container 76ed8e556829f07d046b6dc85c1772f4f8bb6518a59b4b085b6e1f506a3fd601: Status 404 returned error can't find the container with id 76ed8e556829f07d046b6dc85c1772f4f8bb6518a59b4b085b6e1f506a3fd601 Apr 21 14:37:24.806677 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:24.806636 2576 generic.go:358] "Generic (PLEG): container finished" podID="0f1fe0b2-90b7-4692-86eb-4ff302d25972" containerID="94d03a8ee38179d5d29b04a113bf2d59f9658ab8e79cc953a821fd11ba8d3879" exitCode=0 Apr 21 14:37:24.806839 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:24.806703 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" event={"ID":"0f1fe0b2-90b7-4692-86eb-4ff302d25972","Type":"ContainerDied","Data":"94d03a8ee38179d5d29b04a113bf2d59f9658ab8e79cc953a821fd11ba8d3879"} Apr 21 14:37:24.806839 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:24.806741 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" event={"ID":"0f1fe0b2-90b7-4692-86eb-4ff302d25972","Type":"ContainerStarted","Data":"76ed8e556829f07d046b6dc85c1772f4f8bb6518a59b4b085b6e1f506a3fd601"} Apr 21 14:37:25.021373 ip-10-0-138-110 kubenswrapper[2576]: 
I0421 14:37:25.021282 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf" Apr 21 14:37:25.021969 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.021766 2576 scope.go:117] "RemoveContainer" containerID="efa99762fb738ba9e23b80d7df6ac87f97774585b37135b8581ca9783710f417" Apr 21 14:37:25.022045 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:37:25.021992 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-c8mtf_opendatahub(9307725b-46ef-4197-880a-edc49567bde2)\"" pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf" podUID="9307725b-46ef-4197-880a-edc49567bde2" Apr 21 14:37:25.250421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.250385 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9"] Apr 21 14:37:25.253853 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.253830 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9" Apr 21 14:37:25.256497 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.256470 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 21 14:37:25.256626 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.256480 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-9d899\"" Apr 21 14:37:25.256718 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.256703 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 21 14:37:25.265732 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.265709 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9"] Apr 21 14:37:25.323235 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.323154 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/fa237c19-a2aa-4a8f-99aa-a1f9fe94e8ef-operator-config\") pod \"servicemesh-operator3-55f49c5f94-v7bw9\" (UID: \"fa237c19-a2aa-4a8f-99aa-a1f9fe94e8ef\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9" Apr 21 14:37:25.323235 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.323185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnsrq\" (UniqueName: \"kubernetes.io/projected/fa237c19-a2aa-4a8f-99aa-a1f9fe94e8ef-kube-api-access-rnsrq\") pod \"servicemesh-operator3-55f49c5f94-v7bw9\" (UID: \"fa237c19-a2aa-4a8f-99aa-a1f9fe94e8ef\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9" Apr 21 14:37:25.424183 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.424146 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/fa237c19-a2aa-4a8f-99aa-a1f9fe94e8ef-operator-config\") pod \"servicemesh-operator3-55f49c5f94-v7bw9\" (UID: \"fa237c19-a2aa-4a8f-99aa-a1f9fe94e8ef\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9" Apr 21 14:37:25.424183 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.424184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnsrq\" (UniqueName: \"kubernetes.io/projected/fa237c19-a2aa-4a8f-99aa-a1f9fe94e8ef-kube-api-access-rnsrq\") pod \"servicemesh-operator3-55f49c5f94-v7bw9\" (UID: \"fa237c19-a2aa-4a8f-99aa-a1f9fe94e8ef\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9" Apr 21 14:37:25.426687 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.426654 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/fa237c19-a2aa-4a8f-99aa-a1f9fe94e8ef-operator-config\") pod \"servicemesh-operator3-55f49c5f94-v7bw9\" (UID: \"fa237c19-a2aa-4a8f-99aa-a1f9fe94e8ef\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9" Apr 21 14:37:25.438529 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.438502 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnsrq\" (UniqueName: \"kubernetes.io/projected/fa237c19-a2aa-4a8f-99aa-a1f9fe94e8ef-kube-api-access-rnsrq\") pod \"servicemesh-operator3-55f49c5f94-v7bw9\" (UID: \"fa237c19-a2aa-4a8f-99aa-a1f9fe94e8ef\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9" Apr 21 14:37:25.563621 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.563576 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9" Apr 21 14:37:25.707498 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.707469 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9"] Apr 21 14:37:25.710087 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:37:25.710059 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa237c19_a2aa_4a8f_99aa_a1f9fe94e8ef.slice/crio-184278e45cbddbd3575a55afeb410e37e966da15411bc3d176d06e6ce3c7332d WatchSource:0}: Error finding container 184278e45cbddbd3575a55afeb410e37e966da15411bc3d176d06e6ce3c7332d: Status 404 returned error can't find the container with id 184278e45cbddbd3575a55afeb410e37e966da15411bc3d176d06e6ce3c7332d Apr 21 14:37:25.810965 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:25.810929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9" event={"ID":"fa237c19-a2aa-4a8f-99aa-a1f9fe94e8ef","Type":"ContainerStarted","Data":"184278e45cbddbd3575a55afeb410e37e966da15411bc3d176d06e6ce3c7332d"} Apr 21 14:37:26.816843 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:26.816803 2576 generic.go:358] "Generic (PLEG): container finished" podID="0f1fe0b2-90b7-4692-86eb-4ff302d25972" containerID="fc9b5246f9e1e776da32b1724836bb0246a53e11f3d0e3fc144f8736f2caf5b8" exitCode=0 Apr 21 14:37:26.817303 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:26.816846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" event={"ID":"0f1fe0b2-90b7-4692-86eb-4ff302d25972","Type":"ContainerDied","Data":"fc9b5246f9e1e776da32b1724836bb0246a53e11f3d0e3fc144f8736f2caf5b8"} Apr 21 14:37:27.823461 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:27.823424 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="0f1fe0b2-90b7-4692-86eb-4ff302d25972" containerID="9beb7657e95e2b9d02d0d5cd82732e8e0fcc767e682aeeb78157ec3c2c0d601c" exitCode=0 Apr 21 14:37:27.823904 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:27.823509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" event={"ID":"0f1fe0b2-90b7-4692-86eb-4ff302d25972","Type":"ContainerDied","Data":"9beb7657e95e2b9d02d0d5cd82732e8e0fcc767e682aeeb78157ec3c2c0d601c"} Apr 21 14:37:28.828600 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:28.828505 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9" event={"ID":"fa237c19-a2aa-4a8f-99aa-a1f9fe94e8ef","Type":"ContainerStarted","Data":"49e3d4c91851e45cfcf52b377bf189bfd62ebe8fdac4bf6b6b7e99556f2eca68"} Apr 21 14:37:28.828600 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:28.828567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9" Apr 21 14:37:28.876722 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:28.876657 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9" podStartSLOduration=1.062260985 podStartE2EDuration="3.876643598s" podCreationTimestamp="2026-04-21 14:37:25 +0000 UTC" firstStartedPulling="2026-04-21 14:37:25.712517479 +0000 UTC m=+710.378538100" lastFinishedPulling="2026-04-21 14:37:28.526900093 +0000 UTC m=+713.192920713" observedRunningTime="2026-04-21 14:37:28.875387701 +0000 UTC m=+713.541408355" watchObservedRunningTime="2026-04-21 14:37:28.876643598 +0000 UTC m=+713.542664239" Apr 21 14:37:28.959044 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:28.959020 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" Apr 21 14:37:29.055437 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:29.055406 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6278\" (UniqueName: \"kubernetes.io/projected/0f1fe0b2-90b7-4692-86eb-4ff302d25972-kube-api-access-b6278\") pod \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\" (UID: \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\") " Apr 21 14:37:29.055605 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:29.055474 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f1fe0b2-90b7-4692-86eb-4ff302d25972-bundle\") pod \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\" (UID: \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\") " Apr 21 14:37:29.055605 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:29.055497 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f1fe0b2-90b7-4692-86eb-4ff302d25972-util\") pod \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\" (UID: \"0f1fe0b2-90b7-4692-86eb-4ff302d25972\") " Apr 21 14:37:29.056669 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:29.056633 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f1fe0b2-90b7-4692-86eb-4ff302d25972-bundle" (OuterVolumeSpecName: "bundle") pod "0f1fe0b2-90b7-4692-86eb-4ff302d25972" (UID: "0f1fe0b2-90b7-4692-86eb-4ff302d25972"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:37:29.057554 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:29.057526 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f1fe0b2-90b7-4692-86eb-4ff302d25972-kube-api-access-b6278" (OuterVolumeSpecName: "kube-api-access-b6278") pod "0f1fe0b2-90b7-4692-86eb-4ff302d25972" (UID: "0f1fe0b2-90b7-4692-86eb-4ff302d25972"). InnerVolumeSpecName "kube-api-access-b6278". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:37:29.060447 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:29.060410 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f1fe0b2-90b7-4692-86eb-4ff302d25972-util" (OuterVolumeSpecName: "util") pod "0f1fe0b2-90b7-4692-86eb-4ff302d25972" (UID: "0f1fe0b2-90b7-4692-86eb-4ff302d25972"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:37:29.156751 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:29.156722 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f1fe0b2-90b7-4692-86eb-4ff302d25972-util\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:37:29.156751 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:29.156750 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b6278\" (UniqueName: \"kubernetes.io/projected/0f1fe0b2-90b7-4692-86eb-4ff302d25972-kube-api-access-b6278\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:37:29.156928 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:29.156760 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f1fe0b2-90b7-4692-86eb-4ff302d25972-bundle\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:37:29.833087 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:29.833045 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb" event={"ID":"0f1fe0b2-90b7-4692-86eb-4ff302d25972","Type":"ContainerDied","Data":"76ed8e556829f07d046b6dc85c1772f4f8bb6518a59b4b085b6e1f506a3fd601"}
Apr 21 14:37:29.833087 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:29.833086 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76ed8e556829f07d046b6dc85c1772f4f8bb6518a59b4b085b6e1f506a3fd601"
Apr 21 14:37:29.833475 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:29.833086 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c244rsb"
Apr 21 14:37:35.021177 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:35.021107 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf"
Apr 21 14:37:35.021582 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:35.021518 2576 scope.go:117] "RemoveContainer" containerID="efa99762fb738ba9e23b80d7df6ac87f97774585b37135b8581ca9783710f417"
Apr 21 14:37:35.857836 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:35.857801 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf" event={"ID":"9307725b-46ef-4197-880a-edc49567bde2","Type":"ContainerStarted","Data":"95f7faf55dd8c4865c17fc143f8a2a54b7c11b05b24bd69657a0ace41e766339"}
Apr 21 14:37:35.858028 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:35.858010 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf"
Apr 21 14:37:39.835615 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:39.835581 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v7bw9"
Apr 21 14:37:39.856708 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:39.856646 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf" podStartSLOduration=6.630796917 podStartE2EDuration="26.856627666s" podCreationTimestamp="2026-04-21 14:37:13 +0000 UTC" firstStartedPulling="2026-04-21 14:37:15.146787068 +0000 UTC m=+699.812807688" lastFinishedPulling="2026-04-21 14:37:35.372617812 +0000 UTC m=+720.038638437" observedRunningTime="2026-04-21 14:37:35.920553274 +0000 UTC m=+720.586573915" watchObservedRunningTime="2026-04-21 14:37:39.856627666 +0000 UTC m=+724.522648309"
Apr 21 14:37:40.116356 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.116285 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"]
Apr 21 14:37:40.116579 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.116567 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f1fe0b2-90b7-4692-86eb-4ff302d25972" containerName="extract"
Apr 21 14:37:40.116635 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.116580 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f1fe0b2-90b7-4692-86eb-4ff302d25972" containerName="extract"
Apr 21 14:37:40.116635 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.116599 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f1fe0b2-90b7-4692-86eb-4ff302d25972" containerName="pull"
Apr 21 14:37:40.116635 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.116606 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f1fe0b2-90b7-4692-86eb-4ff302d25972" containerName="pull"
Apr 21 14:37:40.116635 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.116621 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f1fe0b2-90b7-4692-86eb-4ff302d25972" containerName="util"
Apr 21 14:37:40.116635 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.116626 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f1fe0b2-90b7-4692-86eb-4ff302d25972" containerName="util"
Apr 21 14:37:40.116798 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.116675 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f1fe0b2-90b7-4692-86eb-4ff302d25972" containerName="extract"
Apr 21 14:37:40.119791 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.119767 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.123189 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.123167 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 21 14:37:40.123319 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.123206 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 21 14:37:40.123319 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.123169 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 21 14:37:40.123319 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.123219 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-jd5l9\""
Apr 21 14:37:40.123512 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.123497 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 21 14:37:40.136465 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.136434 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"]
Apr 21 14:37:40.247813 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.247778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8sp9\" (UniqueName: \"kubernetes.io/projected/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-kube-api-access-q8sp9\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.247813 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.247820 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.248031 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.247920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.248031 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.247974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.248031 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.248011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.248162 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.248047 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.248162 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.248120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.349352 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.349306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.349545 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.349368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.349545 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.349406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.349545 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.349445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.349700 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.349626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.349700 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.349690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8sp9\" (UniqueName: \"kubernetes.io/projected/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-kube-api-access-q8sp9\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.349799 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.349726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.350152 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.350105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.352058 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.352004 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.352181 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.352098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.352313 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.352293 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.352359 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.352337 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.368826 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.368750 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.369153 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.369129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8sp9\" (UniqueName: \"kubernetes.io/projected/a2f05579-a456-46bb-8cd7-e82c3c0a04c2-kube-api-access-q8sp9\") pod \"istiod-openshift-gateway-55ff986f96-2c7jl\" (UID: \"a2f05579-a456-46bb-8cd7-e82c3c0a04c2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.431073 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.431031 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:40.612496 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.612471 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"]
Apr 21 14:37:40.614685 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:37:40.614650 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2f05579_a456_46bb_8cd7_e82c3c0a04c2.slice/crio-7054af82204d4cbd1b19987e3a432bf625e1dc6ce266d86905f71d404edac918 WatchSource:0}: Error finding container 7054af82204d4cbd1b19987e3a432bf625e1dc6ce266d86905f71d404edac918: Status 404 returned error can't find the container with id 7054af82204d4cbd1b19987e3a432bf625e1dc6ce266d86905f71d404edac918
Apr 21 14:37:40.880574 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:40.880479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl" event={"ID":"a2f05579-a456-46bb-8cd7-e82c3c0a04c2","Type":"ContainerStarted","Data":"7054af82204d4cbd1b19987e3a432bf625e1dc6ce266d86905f71d404edac918"}
Apr 21 14:37:43.389713 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:43.389673 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 21 14:37:43.389941 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:43.389746 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 21 14:37:43.895031 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:43.894995 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl" event={"ID":"a2f05579-a456-46bb-8cd7-e82c3c0a04c2","Type":"ContainerStarted","Data":"f71a1e30d561a7160f8cf5b54dfaf5c734f204b63238d1beed39dd3ddf9891f8"}
Apr 21 14:37:43.895225 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:43.895090 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:43.915777 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:43.915727 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl" podStartSLOduration=1.142794432 podStartE2EDuration="3.915713271s" podCreationTimestamp="2026-04-21 14:37:40 +0000 UTC" firstStartedPulling="2026-04-21 14:37:40.616530298 +0000 UTC m=+725.282550919" lastFinishedPulling="2026-04-21 14:37:43.389449135 +0000 UTC m=+728.055469758" observedRunningTime="2026-04-21 14:37:43.913616375 +0000 UTC m=+728.579637017" watchObservedRunningTime="2026-04-21 14:37:43.915713271 +0000 UTC m=+728.581733914"
Apr 21 14:37:44.900745 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:44.900711 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2c7jl"
Apr 21 14:37:46.864436 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:46.864404 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-c8mtf"
Apr 21 14:37:53.806240 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:37:53.806211 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-ckrj2"
Apr 21 14:38:24.689472 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.689430 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc"]
Apr 21 14:38:24.703785 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.703758 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc"]
Apr 21 14:38:24.703920 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.703870 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc"
Apr 21 14:38:24.706921 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.706899 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 21 14:38:24.706921 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.706909 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-h9qp9\""
Apr 21 14:38:24.707774 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.707745 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 21 14:38:24.801946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.801905 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmq8j\" (UniqueName: \"kubernetes.io/projected/7f9a4b74-e442-4289-be96-61458dae1349-kube-api-access-kmq8j\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc\" (UID: \"7f9a4b74-e442-4289-be96-61458dae1349\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc"
Apr 21 14:38:24.802181 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.801972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f9a4b74-e442-4289-be96-61458dae1349-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc\" (UID: \"7f9a4b74-e442-4289-be96-61458dae1349\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc"
Apr 21 14:38:24.802181 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.801998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f9a4b74-e442-4289-be96-61458dae1349-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc\" (UID: \"7f9a4b74-e442-4289-be96-61458dae1349\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc"
Apr 21 14:38:24.902896 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.902859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f9a4b74-e442-4289-be96-61458dae1349-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc\" (UID: \"7f9a4b74-e442-4289-be96-61458dae1349\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc"
Apr 21 14:38:24.903064 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.902911 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmq8j\" (UniqueName: \"kubernetes.io/projected/7f9a4b74-e442-4289-be96-61458dae1349-kube-api-access-kmq8j\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc\" (UID: \"7f9a4b74-e442-4289-be96-61458dae1349\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc"
Apr 21 14:38:24.903064 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.902962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f9a4b74-e442-4289-be96-61458dae1349-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc\" (UID: \"7f9a4b74-e442-4289-be96-61458dae1349\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc"
Apr 21 14:38:24.903310 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.903290 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f9a4b74-e442-4289-be96-61458dae1349-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc\" (UID: \"7f9a4b74-e442-4289-be96-61458dae1349\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc"
Apr 21 14:38:24.903349 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.903313 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f9a4b74-e442-4289-be96-61458dae1349-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc\" (UID: \"7f9a4b74-e442-4289-be96-61458dae1349\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc"
Apr 21 14:38:24.911041 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:24.911014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmq8j\" (UniqueName: \"kubernetes.io/projected/7f9a4b74-e442-4289-be96-61458dae1349-kube-api-access-kmq8j\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc\" (UID: \"7f9a4b74-e442-4289-be96-61458dae1349\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc"
Apr 21 14:38:25.013937 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.013841 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc"
Apr 21 14:38:25.081552 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.081519 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml"]
Apr 21 14:38:25.086634 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.086609 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml"
Apr 21 14:38:25.091609 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.091320 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml"]
Apr 21 14:38:25.139520 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.139485 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc"]
Apr 21 14:38:25.141931 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:38:25.141887 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f9a4b74_e442_4289_be96_61458dae1349.slice/crio-be1d861a3d3ece044d8c9988a9b66a6319a162f83f8ff80a30734e91ad7ee505 WatchSource:0}: Error finding container be1d861a3d3ece044d8c9988a9b66a6319a162f83f8ff80a30734e91ad7ee505: Status 404 returned error can't find the container with id be1d861a3d3ece044d8c9988a9b66a6319a162f83f8ff80a30734e91ad7ee505
Apr 21 14:38:25.205130 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.205004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a08111d7-e466-49e6-82d9-594c43fba03b-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml\" (UID: \"a08111d7-e466-49e6-82d9-594c43fba03b\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml"
Apr 21 14:38:25.205130 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.205067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a08111d7-e466-49e6-82d9-594c43fba03b-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml\" (UID: \"a08111d7-e466-49e6-82d9-594c43fba03b\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml"
Apr 21 14:38:25.205276 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.205219 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tqrq\" (UniqueName: \"kubernetes.io/projected/a08111d7-e466-49e6-82d9-594c43fba03b-kube-api-access-6tqrq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml\" (UID: \"a08111d7-e466-49e6-82d9-594c43fba03b\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml"
Apr 21 14:38:25.306375 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.306295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tqrq\" (UniqueName: \"kubernetes.io/projected/a08111d7-e466-49e6-82d9-594c43fba03b-kube-api-access-6tqrq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml\" (UID: \"a08111d7-e466-49e6-82d9-594c43fba03b\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml"
Apr 21 14:38:25.306375 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.306334 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a08111d7-e466-49e6-82d9-594c43fba03b-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml\" (UID: \"a08111d7-e466-49e6-82d9-594c43fba03b\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml"
Apr 21 14:38:25.306587 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.306472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a08111d7-e466-49e6-82d9-594c43fba03b-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml\" (UID: \"a08111d7-e466-49e6-82d9-594c43fba03b\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml"
Apr 21 14:38:25.306640 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.306626 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a08111d7-e466-49e6-82d9-594c43fba03b-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml\" (UID: \"a08111d7-e466-49e6-82d9-594c43fba03b\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml"
Apr 21 14:38:25.306733 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.306715 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a08111d7-e466-49e6-82d9-594c43fba03b-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml\" (UID: \"a08111d7-e466-49e6-82d9-594c43fba03b\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml"
Apr 21 14:38:25.315241 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.315220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tqrq\" (UniqueName: \"kubernetes.io/projected/a08111d7-e466-49e6-82d9-594c43fba03b-kube-api-access-6tqrq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml\" (UID: \"a08111d7-e466-49e6-82d9-594c43fba03b\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml"
Apr 21 14:38:25.399399 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.399368 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml"
Apr 21 14:38:25.522343 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.522317 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml"]
Apr 21 14:38:25.524211 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:38:25.524177 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda08111d7_e466_49e6_82d9_594c43fba03b.slice/crio-95914500d712688c4527aecd48641c291c3fa6ffec4ef8878778dd6c903de93a WatchSource:0}: Error finding container 95914500d712688c4527aecd48641c291c3fa6ffec4ef8878778dd6c903de93a: Status 404 returned error can't find the container with id 95914500d712688c4527aecd48641c291c3fa6ffec4ef8878778dd6c903de93a
Apr 21 14:38:25.681774 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.681737 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx"]
Apr 21 14:38:25.685232 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.685212 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx"
Apr 21 14:38:25.691705 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.691680 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx"]
Apr 21 14:38:25.811182 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.811100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/271e3623-0df8-44e8-877d-a74357dfa3cc-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx\" (UID: \"271e3623-0df8-44e8-877d-a74357dfa3cc\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx"
Apr 21 14:38:25.811374 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.811279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbgqg\" (UniqueName: \"kubernetes.io/projected/271e3623-0df8-44e8-877d-a74357dfa3cc-kube-api-access-fbgqg\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx\" (UID: \"271e3623-0df8-44e8-877d-a74357dfa3cc\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx"
Apr 21 14:38:25.811374 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.811327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/271e3623-0df8-44e8-877d-a74357dfa3cc-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx\" (UID: \"271e3623-0df8-44e8-877d-a74357dfa3cc\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx"
Apr 21 14:38:25.911787 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.911757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/271e3623-0df8-44e8-877d-a74357dfa3cc-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx\" (UID: \"271e3623-0df8-44e8-877d-a74357dfa3cc\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx"
Apr 21 14:38:25.911968 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.911838 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/271e3623-0df8-44e8-877d-a74357dfa3cc-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx\" (UID: \"271e3623-0df8-44e8-877d-a74357dfa3cc\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx"
Apr 21 14:38:25.911968 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.911936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbgqg\" (UniqueName: \"kubernetes.io/projected/271e3623-0df8-44e8-877d-a74357dfa3cc-kube-api-access-fbgqg\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx\" (UID: \"271e3623-0df8-44e8-877d-a74357dfa3cc\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx"
Apr 21 14:38:25.912272 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.912247 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/271e3623-0df8-44e8-877d-a74357dfa3cc-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx\" (UID: \"271e3623-0df8-44e8-877d-a74357dfa3cc\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx"
Apr 21 14:38:25.912361 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.912275 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/271e3623-0df8-44e8-877d-a74357dfa3cc-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx\" (UID: \"271e3623-0df8-44e8-877d-a74357dfa3cc\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx"
Apr 21 14:38:25.920881 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.920855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbgqg\" (UniqueName: \"kubernetes.io/projected/271e3623-0df8-44e8-877d-a74357dfa3cc-kube-api-access-fbgqg\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx\" (UID: \"271e3623-0df8-44e8-877d-a74357dfa3cc\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx"
Apr 21 14:38:25.994861 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:25.994824 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx"
Apr 21 14:38:26.046649 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.046445 2576 generic.go:358] "Generic (PLEG): container finished" podID="7f9a4b74-e442-4289-be96-61458dae1349" containerID="1699d2f47ede5dd66e9e1f47a695a142f2001f890ced7a8b2c01ca302f930d87" exitCode=0
Apr 21 14:38:26.046649 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.046589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc" event={"ID":"7f9a4b74-e442-4289-be96-61458dae1349","Type":"ContainerDied","Data":"1699d2f47ede5dd66e9e1f47a695a142f2001f890ced7a8b2c01ca302f930d87"}
Apr 21 14:38:26.046649 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.046625 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc" event={"ID":"7f9a4b74-e442-4289-be96-61458dae1349","Type":"ContainerStarted","Data":"be1d861a3d3ece044d8c9988a9b66a6319a162f83f8ff80a30734e91ad7ee505"}
Apr 21 14:38:26.049135 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.048980 2576 generic.go:358] "Generic (PLEG): container finished" podID="a08111d7-e466-49e6-82d9-594c43fba03b" containerID="8d08574b6a1e7dc5159189bea496f033785451c3d5c9e5dbc74061142f2ce2fd" exitCode=0
Apr 21 14:38:26.049250 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.049147 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml" event={"ID":"a08111d7-e466-49e6-82d9-594c43fba03b","Type":"ContainerDied","Data":"8d08574b6a1e7dc5159189bea496f033785451c3d5c9e5dbc74061142f2ce2fd"}
Apr 21 14:38:26.049250 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.049179 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml" event={"ID":"a08111d7-e466-49e6-82d9-594c43fba03b","Type":"ContainerStarted","Data":"95914500d712688c4527aecd48641c291c3fa6ffec4ef8878778dd6c903de93a"}
Apr 21 14:38:26.124251 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.124191 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx"]
Apr 21 14:38:26.126385 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:38:26.126354 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod271e3623_0df8_44e8_877d_a74357dfa3cc.slice/crio-372ca922d3b7ab71ff24e6ddd27f3567681e59b10273580133231c6eb07503aa WatchSource:0}: Error finding container 372ca922d3b7ab71ff24e6ddd27f3567681e59b10273580133231c6eb07503aa: Status 404 returned error can't find the container with id 372ca922d3b7ab71ff24e6ddd27f3567681e59b10273580133231c6eb07503aa
Apr 21 14:38:26.278328 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.278292 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8"]
Apr 21 14:38:26.281676 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.281656 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8"
Apr 21 14:38:26.289954 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.289926 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8"]
Apr 21 14:38:26.315320 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.315278 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/825448d9-2c0b-4843-9253-284da6d74283-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8\" (UID: \"825448d9-2c0b-4843-9253-284da6d74283\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8"
Apr 21 14:38:26.315461 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.315331 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hpc6\" (UniqueName: \"kubernetes.io/projected/825448d9-2c0b-4843-9253-284da6d74283-kube-api-access-7hpc6\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8\" (UID: \"825448d9-2c0b-4843-9253-284da6d74283\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8"
Apr 21 14:38:26.315461 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.315417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/825448d9-2c0b-4843-9253-284da6d74283-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8\" (UID: \"825448d9-2c0b-4843-9253-284da6d74283\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8"
Apr 21 14:38:26.416279 ip-10-0-138-110
kubenswrapper[2576]: I0421 14:38:26.416238 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hpc6\" (UniqueName: \"kubernetes.io/projected/825448d9-2c0b-4843-9253-284da6d74283-kube-api-access-7hpc6\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8\" (UID: \"825448d9-2c0b-4843-9253-284da6d74283\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8" Apr 21 14:38:26.416481 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.416331 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/825448d9-2c0b-4843-9253-284da6d74283-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8\" (UID: \"825448d9-2c0b-4843-9253-284da6d74283\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8" Apr 21 14:38:26.416481 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.416377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/825448d9-2c0b-4843-9253-284da6d74283-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8\" (UID: \"825448d9-2c0b-4843-9253-284da6d74283\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8" Apr 21 14:38:26.416723 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.416703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/825448d9-2c0b-4843-9253-284da6d74283-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8\" (UID: \"825448d9-2c0b-4843-9253-284da6d74283\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8" Apr 21 14:38:26.416791 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.416735 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/825448d9-2c0b-4843-9253-284da6d74283-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8\" (UID: \"825448d9-2c0b-4843-9253-284da6d74283\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8" Apr 21 14:38:26.424577 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.424511 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hpc6\" (UniqueName: \"kubernetes.io/projected/825448d9-2c0b-4843-9253-284da6d74283-kube-api-access-7hpc6\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8\" (UID: \"825448d9-2c0b-4843-9253-284da6d74283\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8" Apr 21 14:38:26.592624 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.592589 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8" Apr 21 14:38:26.713463 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:26.713441 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8"] Apr 21 14:38:26.749189 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:38:26.749158 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod825448d9_2c0b_4843_9253_284da6d74283.slice/crio-5e08c3a6aa9de41fd121332edcca645d3598c42bc17546618e0fa405101a9085 WatchSource:0}: Error finding container 5e08c3a6aa9de41fd121332edcca645d3598c42bc17546618e0fa405101a9085: Status 404 returned error can't find the container with id 5e08c3a6aa9de41fd121332edcca645d3598c42bc17546618e0fa405101a9085 Apr 21 14:38:27.060902 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:27.060853 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="825448d9-2c0b-4843-9253-284da6d74283" containerID="5a5cb639ac72b69c32b43e78972482041e156dc09eac82583f30e8269558146c" exitCode=0 Apr 21 14:38:27.061101 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:27.060900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8" event={"ID":"825448d9-2c0b-4843-9253-284da6d74283","Type":"ContainerDied","Data":"5a5cb639ac72b69c32b43e78972482041e156dc09eac82583f30e8269558146c"} Apr 21 14:38:27.061101 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:27.060949 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8" event={"ID":"825448d9-2c0b-4843-9253-284da6d74283","Type":"ContainerStarted","Data":"5e08c3a6aa9de41fd121332edcca645d3598c42bc17546618e0fa405101a9085"} Apr 21 14:38:27.062802 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:27.062778 2576 generic.go:358] "Generic (PLEG): container finished" podID="271e3623-0df8-44e8-877d-a74357dfa3cc" containerID="ef87d3a189b2304119e49826662e7051257fa3344e90c171f0a4d139f589efee" exitCode=0 Apr 21 14:38:27.062912 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:27.062809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx" event={"ID":"271e3623-0df8-44e8-877d-a74357dfa3cc","Type":"ContainerDied","Data":"ef87d3a189b2304119e49826662e7051257fa3344e90c171f0a4d139f589efee"} Apr 21 14:38:27.062912 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:27.062839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx" event={"ID":"271e3623-0df8-44e8-877d-a74357dfa3cc","Type":"ContainerStarted","Data":"372ca922d3b7ab71ff24e6ddd27f3567681e59b10273580133231c6eb07503aa"} Apr 21 14:38:27.065681 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:27.065657 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc" event={"ID":"7f9a4b74-e442-4289-be96-61458dae1349","Type":"ContainerStarted","Data":"4b94b17fa39fa391d7a0d9f9ad5f3b6e5990dfb0e5aac3a22c47ca56a12032d3"} Apr 21 14:38:27.067688 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:27.067668 2576 generic.go:358] "Generic (PLEG): container finished" podID="a08111d7-e466-49e6-82d9-594c43fba03b" containerID="8a21337a1a0643be96ea37fbd544b946530fe15b74b1618763be6896ed870b5f" exitCode=0 Apr 21 14:38:27.067765 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:27.067729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml" event={"ID":"a08111d7-e466-49e6-82d9-594c43fba03b","Type":"ContainerDied","Data":"8a21337a1a0643be96ea37fbd544b946530fe15b74b1618763be6896ed870b5f"} Apr 21 14:38:28.072518 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:28.072487 2576 generic.go:358] "Generic (PLEG): container finished" podID="7f9a4b74-e442-4289-be96-61458dae1349" containerID="4b94b17fa39fa391d7a0d9f9ad5f3b6e5990dfb0e5aac3a22c47ca56a12032d3" exitCode=0 Apr 21 14:38:28.072864 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:28.072570 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc" event={"ID":"7f9a4b74-e442-4289-be96-61458dae1349","Type":"ContainerDied","Data":"4b94b17fa39fa391d7a0d9f9ad5f3b6e5990dfb0e5aac3a22c47ca56a12032d3"} Apr 21 14:38:28.074565 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:28.074543 2576 generic.go:358] "Generic (PLEG): container finished" podID="a08111d7-e466-49e6-82d9-594c43fba03b" containerID="257f136555acc51026754f7ff66eadb7fc842093c41b71905260b2163cab8cb7" exitCode=0 Apr 21 14:38:28.074668 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:28.074627 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml" event={"ID":"a08111d7-e466-49e6-82d9-594c43fba03b","Type":"ContainerDied","Data":"257f136555acc51026754f7ff66eadb7fc842093c41b71905260b2163cab8cb7"} Apr 21 14:38:28.076530 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:28.076508 2576 generic.go:358] "Generic (PLEG): container finished" podID="825448d9-2c0b-4843-9253-284da6d74283" containerID="e433cfe7d8e2be8c2415ee42b79b6650ee561d1e31adb994d1d8eba640f13fe9" exitCode=0 Apr 21 14:38:28.076609 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:28.076568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8" event={"ID":"825448d9-2c0b-4843-9253-284da6d74283","Type":"ContainerDied","Data":"e433cfe7d8e2be8c2415ee42b79b6650ee561d1e31adb994d1d8eba640f13fe9"} Apr 21 14:38:28.078090 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:28.078068 2576 generic.go:358] "Generic (PLEG): container finished" podID="271e3623-0df8-44e8-877d-a74357dfa3cc" containerID="b39dab304b70d529ac99cacae7f6162acc010b5bed8ac56c123f9952ac6f5e52" exitCode=0 Apr 21 14:38:28.078165 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:28.078148 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx" event={"ID":"271e3623-0df8-44e8-877d-a74357dfa3cc","Type":"ContainerDied","Data":"b39dab304b70d529ac99cacae7f6162acc010b5bed8ac56c123f9952ac6f5e52"} Apr 21 14:38:29.085412 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.085376 2576 generic.go:358] "Generic (PLEG): container finished" podID="825448d9-2c0b-4843-9253-284da6d74283" containerID="ef96950a0ffe8d53c29782c0c00f2d8093b82ac40beac9f6dae823c316fce83a" exitCode=0 Apr 21 14:38:29.085818 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.085460 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8" event={"ID":"825448d9-2c0b-4843-9253-284da6d74283","Type":"ContainerDied","Data":"ef96950a0ffe8d53c29782c0c00f2d8093b82ac40beac9f6dae823c316fce83a"} Apr 21 14:38:29.087058 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.087033 2576 generic.go:358] "Generic (PLEG): container finished" podID="271e3623-0df8-44e8-877d-a74357dfa3cc" containerID="4b886fdc74f3d69f17a5adf6a53f3f0c0291a05fb0046108ddd3b9e0041b313f" exitCode=0 Apr 21 14:38:29.087182 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.087132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx" event={"ID":"271e3623-0df8-44e8-877d-a74357dfa3cc","Type":"ContainerDied","Data":"4b886fdc74f3d69f17a5adf6a53f3f0c0291a05fb0046108ddd3b9e0041b313f"} Apr 21 14:38:29.088687 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.088662 2576 generic.go:358] "Generic (PLEG): container finished" podID="7f9a4b74-e442-4289-be96-61458dae1349" containerID="a39cf0616d99a7304c7da53f17def3813b7c151fb621c7602f6e6120c69fb3ae" exitCode=0 Apr 21 14:38:29.088796 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.088703 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc" event={"ID":"7f9a4b74-e442-4289-be96-61458dae1349","Type":"ContainerDied","Data":"a39cf0616d99a7304c7da53f17def3813b7c151fb621c7602f6e6120c69fb3ae"} Apr 21 14:38:29.214716 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.214692 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml" Apr 21 14:38:29.341046 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.341009 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tqrq\" (UniqueName: \"kubernetes.io/projected/a08111d7-e466-49e6-82d9-594c43fba03b-kube-api-access-6tqrq\") pod \"a08111d7-e466-49e6-82d9-594c43fba03b\" (UID: \"a08111d7-e466-49e6-82d9-594c43fba03b\") " Apr 21 14:38:29.341211 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.341077 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a08111d7-e466-49e6-82d9-594c43fba03b-bundle\") pod \"a08111d7-e466-49e6-82d9-594c43fba03b\" (UID: \"a08111d7-e466-49e6-82d9-594c43fba03b\") " Apr 21 14:38:29.341211 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.341104 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a08111d7-e466-49e6-82d9-594c43fba03b-util\") pod \"a08111d7-e466-49e6-82d9-594c43fba03b\" (UID: \"a08111d7-e466-49e6-82d9-594c43fba03b\") " Apr 21 14:38:29.341577 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.341543 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a08111d7-e466-49e6-82d9-594c43fba03b-bundle" (OuterVolumeSpecName: "bundle") pod "a08111d7-e466-49e6-82d9-594c43fba03b" (UID: "a08111d7-e466-49e6-82d9-594c43fba03b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:38:29.343378 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.343348 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08111d7-e466-49e6-82d9-594c43fba03b-kube-api-access-6tqrq" (OuterVolumeSpecName: "kube-api-access-6tqrq") pod "a08111d7-e466-49e6-82d9-594c43fba03b" (UID: "a08111d7-e466-49e6-82d9-594c43fba03b"). InnerVolumeSpecName "kube-api-access-6tqrq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:38:29.346601 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.346556 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a08111d7-e466-49e6-82d9-594c43fba03b-util" (OuterVolumeSpecName: "util") pod "a08111d7-e466-49e6-82d9-594c43fba03b" (UID: "a08111d7-e466-49e6-82d9-594c43fba03b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:38:29.441959 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.441907 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6tqrq\" (UniqueName: \"kubernetes.io/projected/a08111d7-e466-49e6-82d9-594c43fba03b-kube-api-access-6tqrq\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:38:29.441959 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.441956 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a08111d7-e466-49e6-82d9-594c43fba03b-bundle\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:38:29.441959 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:29.441968 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a08111d7-e466-49e6-82d9-594c43fba03b-util\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:38:30.094020 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.093930 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml" Apr 21 14:38:30.094020 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.093959 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml" event={"ID":"a08111d7-e466-49e6-82d9-594c43fba03b","Type":"ContainerDied","Data":"95914500d712688c4527aecd48641c291c3fa6ffec4ef8878778dd6c903de93a"} Apr 21 14:38:30.094020 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.094001 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95914500d712688c4527aecd48641c291c3fa6ffec4ef8878778dd6c903de93a" Apr 21 14:38:30.233947 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.233920 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8" Apr 21 14:38:30.271906 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.271885 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx" Apr 21 14:38:30.274945 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.274928 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc" Apr 21 14:38:30.348542 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.348461 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/825448d9-2c0b-4843-9253-284da6d74283-bundle\") pod \"825448d9-2c0b-4843-9253-284da6d74283\" (UID: \"825448d9-2c0b-4843-9253-284da6d74283\") " Apr 21 14:38:30.348542 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.348496 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hpc6\" (UniqueName: \"kubernetes.io/projected/825448d9-2c0b-4843-9253-284da6d74283-kube-api-access-7hpc6\") pod \"825448d9-2c0b-4843-9253-284da6d74283\" (UID: \"825448d9-2c0b-4843-9253-284da6d74283\") " Apr 21 14:38:30.348542 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.348523 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f9a4b74-e442-4289-be96-61458dae1349-bundle\") pod \"7f9a4b74-e442-4289-be96-61458dae1349\" (UID: \"7f9a4b74-e442-4289-be96-61458dae1349\") " Apr 21 14:38:30.348815 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.348554 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/825448d9-2c0b-4843-9253-284da6d74283-util\") pod \"825448d9-2c0b-4843-9253-284da6d74283\" (UID: \"825448d9-2c0b-4843-9253-284da6d74283\") " Apr 21 14:38:30.348815 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.348613 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f9a4b74-e442-4289-be96-61458dae1349-util\") pod \"7f9a4b74-e442-4289-be96-61458dae1349\" (UID: \"7f9a4b74-e442-4289-be96-61458dae1349\") " Apr 21 14:38:30.348815 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.348653 
2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmq8j\" (UniqueName: \"kubernetes.io/projected/7f9a4b74-e442-4289-be96-61458dae1349-kube-api-access-kmq8j\") pod \"7f9a4b74-e442-4289-be96-61458dae1349\" (UID: \"7f9a4b74-e442-4289-be96-61458dae1349\") " Apr 21 14:38:30.348815 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.348698 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/271e3623-0df8-44e8-877d-a74357dfa3cc-util\") pod \"271e3623-0df8-44e8-877d-a74357dfa3cc\" (UID: \"271e3623-0df8-44e8-877d-a74357dfa3cc\") " Apr 21 14:38:30.348815 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.348735 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/271e3623-0df8-44e8-877d-a74357dfa3cc-bundle\") pod \"271e3623-0df8-44e8-877d-a74357dfa3cc\" (UID: \"271e3623-0df8-44e8-877d-a74357dfa3cc\") " Apr 21 14:38:30.348815 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.348760 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbgqg\" (UniqueName: \"kubernetes.io/projected/271e3623-0df8-44e8-877d-a74357dfa3cc-kube-api-access-fbgqg\") pod \"271e3623-0df8-44e8-877d-a74357dfa3cc\" (UID: \"271e3623-0df8-44e8-877d-a74357dfa3cc\") " Apr 21 14:38:30.350402 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.349081 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f9a4b74-e442-4289-be96-61458dae1349-bundle" (OuterVolumeSpecName: "bundle") pod "7f9a4b74-e442-4289-be96-61458dae1349" (UID: "7f9a4b74-e442-4289-be96-61458dae1349"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:38:30.350402 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.349431 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/825448d9-2c0b-4843-9253-284da6d74283-bundle" (OuterVolumeSpecName: "bundle") pod "825448d9-2c0b-4843-9253-284da6d74283" (UID: "825448d9-2c0b-4843-9253-284da6d74283"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:38:30.350402 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.350240 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/271e3623-0df8-44e8-877d-a74357dfa3cc-bundle" (OuterVolumeSpecName: "bundle") pod "271e3623-0df8-44e8-877d-a74357dfa3cc" (UID: "271e3623-0df8-44e8-877d-a74357dfa3cc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:38:30.351308 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.351280 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/271e3623-0df8-44e8-877d-a74357dfa3cc-kube-api-access-fbgqg" (OuterVolumeSpecName: "kube-api-access-fbgqg") pod "271e3623-0df8-44e8-877d-a74357dfa3cc" (UID: "271e3623-0df8-44e8-877d-a74357dfa3cc"). InnerVolumeSpecName "kube-api-access-fbgqg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:38:30.351421 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.351317 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/825448d9-2c0b-4843-9253-284da6d74283-kube-api-access-7hpc6" (OuterVolumeSpecName: "kube-api-access-7hpc6") pod "825448d9-2c0b-4843-9253-284da6d74283" (UID: "825448d9-2c0b-4843-9253-284da6d74283"). InnerVolumeSpecName "kube-api-access-7hpc6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:38:30.351862 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.351842 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f9a4b74-e442-4289-be96-61458dae1349-kube-api-access-kmq8j" (OuterVolumeSpecName: "kube-api-access-kmq8j") pod "7f9a4b74-e442-4289-be96-61458dae1349" (UID: "7f9a4b74-e442-4289-be96-61458dae1349"). InnerVolumeSpecName "kube-api-access-kmq8j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:38:30.354946 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.354915 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f9a4b74-e442-4289-be96-61458dae1349-util" (OuterVolumeSpecName: "util") pod "7f9a4b74-e442-4289-be96-61458dae1349" (UID: "7f9a4b74-e442-4289-be96-61458dae1349"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:38:30.355199 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.355179 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/825448d9-2c0b-4843-9253-284da6d74283-util" (OuterVolumeSpecName: "util") pod "825448d9-2c0b-4843-9253-284da6d74283" (UID: "825448d9-2c0b-4843-9253-284da6d74283"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:38:30.356247 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.356228 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/271e3623-0df8-44e8-877d-a74357dfa3cc-util" (OuterVolumeSpecName: "util") pod "271e3623-0df8-44e8-877d-a74357dfa3cc" (UID: "271e3623-0df8-44e8-877d-a74357dfa3cc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:38:30.449947 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.449901 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/271e3623-0df8-44e8-877d-a74357dfa3cc-bundle\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:38:30.449947 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.449944 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fbgqg\" (UniqueName: \"kubernetes.io/projected/271e3623-0df8-44e8-877d-a74357dfa3cc-kube-api-access-fbgqg\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:38:30.449947 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.449955 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/825448d9-2c0b-4843-9253-284da6d74283-bundle\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:38:30.449947 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.449964 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7hpc6\" (UniqueName: \"kubernetes.io/projected/825448d9-2c0b-4843-9253-284da6d74283-kube-api-access-7hpc6\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:38:30.450222 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.449973 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f9a4b74-e442-4289-be96-61458dae1349-bundle\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:38:30.450222 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.449981 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/825448d9-2c0b-4843-9253-284da6d74283-util\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:38:30.450222 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.449989 2576 
reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f9a4b74-e442-4289-be96-61458dae1349-util\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:38:30.450222 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.449996 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kmq8j\" (UniqueName: \"kubernetes.io/projected/7f9a4b74-e442-4289-be96-61458dae1349-kube-api-access-kmq8j\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:38:30.450222 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:30.450004 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/271e3623-0df8-44e8-877d-a74357dfa3cc-util\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:38:31.099750 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:31.099709 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8" event={"ID":"825448d9-2c0b-4843-9253-284da6d74283","Type":"ContainerDied","Data":"5e08c3a6aa9de41fd121332edcca645d3598c42bc17546618e0fa405101a9085"} Apr 21 14:38:31.099750 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:31.099743 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8" Apr 21 14:38:31.100236 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:31.099745 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e08c3a6aa9de41fd121332edcca645d3598c42bc17546618e0fa405101a9085" Apr 21 14:38:31.101515 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:31.101493 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx" Apr 21 14:38:31.101515 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:31.101500 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx" event={"ID":"271e3623-0df8-44e8-877d-a74357dfa3cc","Type":"ContainerDied","Data":"372ca922d3b7ab71ff24e6ddd27f3567681e59b10273580133231c6eb07503aa"} Apr 21 14:38:31.101692 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:31.101533 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="372ca922d3b7ab71ff24e6ddd27f3567681e59b10273580133231c6eb07503aa" Apr 21 14:38:31.103127 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:31.103093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc" event={"ID":"7f9a4b74-e442-4289-be96-61458dae1349","Type":"ContainerDied","Data":"be1d861a3d3ece044d8c9988a9b66a6319a162f83f8ff80a30734e91ad7ee505"} Apr 21 14:38:31.103229 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:31.103135 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc" Apr 21 14:38:31.103229 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:31.103139 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be1d861a3d3ece044d8c9988a9b66a6319a162f83f8ff80a30734e91ad7ee505" Apr 21 14:38:42.055080 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055047 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757"] Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055356 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f9a4b74-e442-4289-be96-61458dae1349" containerName="util" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055368 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9a4b74-e442-4289-be96-61458dae1349" containerName="util" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055377 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f9a4b74-e442-4289-be96-61458dae1349" containerName="extract" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055382 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9a4b74-e442-4289-be96-61458dae1349" containerName="extract" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055394 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a08111d7-e466-49e6-82d9-594c43fba03b" containerName="pull" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055399 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08111d7-e466-49e6-82d9-594c43fba03b" containerName="pull" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055405 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="825448d9-2c0b-4843-9253-284da6d74283" containerName="extract" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055410 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="825448d9-2c0b-4843-9253-284da6d74283" containerName="extract" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055416 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f9a4b74-e442-4289-be96-61458dae1349" containerName="pull" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055421 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9a4b74-e442-4289-be96-61458dae1349" containerName="pull" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055433 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="825448d9-2c0b-4843-9253-284da6d74283" containerName="util" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055438 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="825448d9-2c0b-4843-9253-284da6d74283" containerName="util" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055445 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="271e3623-0df8-44e8-877d-a74357dfa3cc" containerName="util" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055449 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="271e3623-0df8-44e8-877d-a74357dfa3cc" containerName="util" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055454 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="271e3623-0df8-44e8-877d-a74357dfa3cc" containerName="extract" Apr 21 14:38:42.055451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055459 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="271e3623-0df8-44e8-877d-a74357dfa3cc" containerName="extract" 
Apr 21 14:38:42.055901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055469 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a08111d7-e466-49e6-82d9-594c43fba03b" containerName="util" Apr 21 14:38:42.055901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055474 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08111d7-e466-49e6-82d9-594c43fba03b" containerName="util" Apr 21 14:38:42.055901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055479 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="825448d9-2c0b-4843-9253-284da6d74283" containerName="pull" Apr 21 14:38:42.055901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055484 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="825448d9-2c0b-4843-9253-284da6d74283" containerName="pull" Apr 21 14:38:42.055901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055490 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a08111d7-e466-49e6-82d9-594c43fba03b" containerName="extract" Apr 21 14:38:42.055901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055495 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08111d7-e466-49e6-82d9-594c43fba03b" containerName="extract" Apr 21 14:38:42.055901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055500 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="271e3623-0df8-44e8-877d-a74357dfa3cc" containerName="pull" Apr 21 14:38:42.055901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055505 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="271e3623-0df8-44e8-877d-a74357dfa3cc" containerName="pull" Apr 21 14:38:42.055901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055551 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="825448d9-2c0b-4843-9253-284da6d74283" containerName="extract" Apr 21 14:38:42.055901 ip-10-0-138-110 kubenswrapper[2576]: I0421 
14:38:42.055560 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="271e3623-0df8-44e8-877d-a74357dfa3cc" containerName="extract" Apr 21 14:38:42.055901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055567 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a08111d7-e466-49e6-82d9-594c43fba03b" containerName="extract" Apr 21 14:38:42.055901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.055573 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f9a4b74-e442-4289-be96-61458dae1349" containerName="extract" Apr 21 14:38:42.059863 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.059846 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" Apr 21 14:38:42.062849 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.062825 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 14:38:42.062964 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.062843 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 14:38:42.063690 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.063672 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-svcxn\"" Apr 21 14:38:42.071444 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.071421 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757"] Apr 21 14:38:42.152472 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.152435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs2gx\" (UniqueName: \"kubernetes.io/projected/fc745aec-cfad-4df0-91a2-cf280f6d5f4e-kube-api-access-zs2gx\") pod 
\"limitador-operator-controller-manager-85c4996f8c-9k757\" (UID: \"fc745aec-cfad-4df0-91a2-cf280f6d5f4e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" Apr 21 14:38:42.253726 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.253684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zs2gx\" (UniqueName: \"kubernetes.io/projected/fc745aec-cfad-4df0-91a2-cf280f6d5f4e-kube-api-access-zs2gx\") pod \"limitador-operator-controller-manager-85c4996f8c-9k757\" (UID: \"fc745aec-cfad-4df0-91a2-cf280f6d5f4e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" Apr 21 14:38:42.264043 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.264011 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs2gx\" (UniqueName: \"kubernetes.io/projected/fc745aec-cfad-4df0-91a2-cf280f6d5f4e-kube-api-access-zs2gx\") pod \"limitador-operator-controller-manager-85c4996f8c-9k757\" (UID: \"fc745aec-cfad-4df0-91a2-cf280f6d5f4e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" Apr 21 14:38:42.370700 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.370623 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" Apr 21 14:38:42.507060 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:42.507027 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757"] Apr 21 14:38:42.511197 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:38:42.511170 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc745aec_cfad_4df0_91a2_cf280f6d5f4e.slice/crio-84e74a30637c38dc983d0ab37d53792a43d3d89f173562be34a3265d032d44de WatchSource:0}: Error finding container 84e74a30637c38dc983d0ab37d53792a43d3d89f173562be34a3265d032d44de: Status 404 returned error can't find the container with id 84e74a30637c38dc983d0ab37d53792a43d3d89f173562be34a3265d032d44de Apr 21 14:38:43.147460 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:43.147425 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" event={"ID":"fc745aec-cfad-4df0-91a2-cf280f6d5f4e","Type":"ContainerStarted","Data":"84e74a30637c38dc983d0ab37d53792a43d3d89f173562be34a3265d032d44de"} Apr 21 14:38:47.168583 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:47.168549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" event={"ID":"fc745aec-cfad-4df0-91a2-cf280f6d5f4e","Type":"ContainerStarted","Data":"e2bff9be775b3a2937008fe6b3dd87df3bbbe5dfafa626ef82ff08add082d203"} Apr 21 14:38:47.169008 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:47.168671 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" Apr 21 14:38:47.190870 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:47.190821 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" podStartSLOduration=1.425271859 podStartE2EDuration="5.190805898s" podCreationTimestamp="2026-04-21 14:38:42 +0000 UTC" firstStartedPulling="2026-04-21 14:38:42.513052477 +0000 UTC m=+787.179073097" lastFinishedPulling="2026-04-21 14:38:46.278586512 +0000 UTC m=+790.944607136" observedRunningTime="2026-04-21 14:38:47.189923371 +0000 UTC m=+791.855944015" watchObservedRunningTime="2026-04-21 14:38:47.190805898 +0000 UTC m=+791.856826539" Apr 21 14:38:57.311702 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.311613 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757"] Apr 21 14:38:57.312219 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.311938 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" podUID="fc745aec-cfad-4df0-91a2-cf280f6d5f4e" containerName="manager" containerID="cri-o://e2bff9be775b3a2937008fe6b3dd87df3bbbe5dfafa626ef82ff08add082d203" gracePeriod=2 Apr 21 14:38:57.313838 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.313811 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" Apr 21 14:38:57.328451 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.328415 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757"] Apr 21 14:38:57.336469 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.336444 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4rx8q"] Apr 21 14:38:57.336768 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.336754 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="fc745aec-cfad-4df0-91a2-cf280f6d5f4e" containerName="manager" Apr 21 14:38:57.336768 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.336768 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc745aec-cfad-4df0-91a2-cf280f6d5f4e" containerName="manager" Apr 21 14:38:57.336883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.336819 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc745aec-cfad-4df0-91a2-cf280f6d5f4e" containerName="manager" Apr 21 14:38:57.339888 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.339862 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4rx8q" Apr 21 14:38:57.342517 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.342485 2576 status_manager.go:895] "Failed to get status for pod" podUID="fc745aec-cfad-4df0-91a2-cf280f6d5f4e" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" err="pods \"limitador-operator-controller-manager-85c4996f8c-9k757\" is forbidden: User \"system:node:ip-10-0-138-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-110.ec2.internal' and this object" Apr 21 14:38:57.352290 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.352262 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4rx8q"] Apr 21 14:38:57.482798 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.482727 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf86s\" (UniqueName: \"kubernetes.io/projected/06d6177d-f05c-456f-a1ea-0990ce113035-kube-api-access-zf86s\") pod \"limitador-operator-controller-manager-85c4996f8c-4rx8q\" (UID: \"06d6177d-f05c-456f-a1ea-0990ce113035\") " 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4rx8q" Apr 21 14:38:57.539311 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.539288 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" Apr 21 14:38:57.541963 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.541938 2576 status_manager.go:895] "Failed to get status for pod" podUID="fc745aec-cfad-4df0-91a2-cf280f6d5f4e" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" err="pods \"limitador-operator-controller-manager-85c4996f8c-9k757\" is forbidden: User \"system:node:ip-10-0-138-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-110.ec2.internal' and this object" Apr 21 14:38:57.583450 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.583377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zf86s\" (UniqueName: \"kubernetes.io/projected/06d6177d-f05c-456f-a1ea-0990ce113035-kube-api-access-zf86s\") pod \"limitador-operator-controller-manager-85c4996f8c-4rx8q\" (UID: \"06d6177d-f05c-456f-a1ea-0990ce113035\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4rx8q" Apr 21 14:38:57.591784 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.591753 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf86s\" (UniqueName: \"kubernetes.io/projected/06d6177d-f05c-456f-a1ea-0990ce113035-kube-api-access-zf86s\") pod \"limitador-operator-controller-manager-85c4996f8c-4rx8q\" (UID: \"06d6177d-f05c-456f-a1ea-0990ce113035\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4rx8q" Apr 21 14:38:57.683881 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.683837 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zs2gx\" (UniqueName: \"kubernetes.io/projected/fc745aec-cfad-4df0-91a2-cf280f6d5f4e-kube-api-access-zs2gx\") pod \"fc745aec-cfad-4df0-91a2-cf280f6d5f4e\" (UID: \"fc745aec-cfad-4df0-91a2-cf280f6d5f4e\") " Apr 21 14:38:57.685895 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.685865 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc745aec-cfad-4df0-91a2-cf280f6d5f4e-kube-api-access-zs2gx" (OuterVolumeSpecName: "kube-api-access-zs2gx") pod "fc745aec-cfad-4df0-91a2-cf280f6d5f4e" (UID: "fc745aec-cfad-4df0-91a2-cf280f6d5f4e"). InnerVolumeSpecName "kube-api-access-zs2gx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:38:57.688065 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.688040 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4rx8q" Apr 21 14:38:57.784620 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.784590 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zs2gx\" (UniqueName: \"kubernetes.io/projected/fc745aec-cfad-4df0-91a2-cf280f6d5f4e-kube-api-access-zs2gx\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:38:57.811151 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.811098 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc745aec-cfad-4df0-91a2-cf280f6d5f4e" path="/var/lib/kubelet/pods/fc745aec-cfad-4df0-91a2-cf280f6d5f4e/volumes" Apr 21 14:38:57.830606 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:57.830583 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4rx8q"] Apr 21 14:38:57.832802 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:38:57.832773 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06d6177d_f05c_456f_a1ea_0990ce113035.slice/crio-ae0e4f8153d40c95e6cb60a49b82ba37db23e826c81edc25b6b4f31ca4675d27 WatchSource:0}: Error finding container ae0e4f8153d40c95e6cb60a49b82ba37db23e826c81edc25b6b4f31ca4675d27: Status 404 returned error can't find the container with id ae0e4f8153d40c95e6cb60a49b82ba37db23e826c81edc25b6b4f31ca4675d27 Apr 21 14:38:58.198547 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.198510 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh"] Apr 21 14:38:58.201840 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.201820 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" Apr 21 14:38:58.204552 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.204531 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-75flw\"" Apr 21 14:38:58.212678 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.212650 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4rx8q" event={"ID":"06d6177d-f05c-456f-a1ea-0990ce113035","Type":"ContainerStarted","Data":"2a20bbb0bfe88111f402c2ecf10be0ab10ff26d90275eff483253f0af028919c"} Apr 21 14:38:58.212798 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.212688 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4rx8q" event={"ID":"06d6177d-f05c-456f-a1ea-0990ce113035","Type":"ContainerStarted","Data":"ae0e4f8153d40c95e6cb60a49b82ba37db23e826c81edc25b6b4f31ca4675d27"} Apr 21 14:38:58.212861 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.212814 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4rx8q" Apr 21 14:38:58.212861 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.212842 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh"] Apr 21 14:38:58.213937 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.213910 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc745aec-cfad-4df0-91a2-cf280f6d5f4e" containerID="e2bff9be775b3a2937008fe6b3dd87df3bbbe5dfafa626ef82ff08add082d203" exitCode=0 Apr 21 14:38:58.214051 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.213987 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9k757" Apr 21 14:38:58.214051 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.214020 2576 scope.go:117] "RemoveContainer" containerID="e2bff9be775b3a2937008fe6b3dd87df3bbbe5dfafa626ef82ff08add082d203" Apr 21 14:38:58.223309 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.223291 2576 scope.go:117] "RemoveContainer" containerID="e2bff9be775b3a2937008fe6b3dd87df3bbbe5dfafa626ef82ff08add082d203" Apr 21 14:38:58.223596 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:38:58.223567 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2bff9be775b3a2937008fe6b3dd87df3bbbe5dfafa626ef82ff08add082d203\": container with ID starting with e2bff9be775b3a2937008fe6b3dd87df3bbbe5dfafa626ef82ff08add082d203 not found: ID does not exist" containerID="e2bff9be775b3a2937008fe6b3dd87df3bbbe5dfafa626ef82ff08add082d203" Apr 21 14:38:58.223659 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.223607 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2bff9be775b3a2937008fe6b3dd87df3bbbe5dfafa626ef82ff08add082d203"} err="failed to get container status 
\"e2bff9be775b3a2937008fe6b3dd87df3bbbe5dfafa626ef82ff08add082d203\": rpc error: code = NotFound desc = could not find container \"e2bff9be775b3a2937008fe6b3dd87df3bbbe5dfafa626ef82ff08add082d203\": container with ID starting with e2bff9be775b3a2937008fe6b3dd87df3bbbe5dfafa626ef82ff08add082d203 not found: ID does not exist" Apr 21 14:38:58.256034 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.255986 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4rx8q" podStartSLOduration=1.255973168 podStartE2EDuration="1.255973168s" podCreationTimestamp="2026-04-21 14:38:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:38:58.253809314 +0000 UTC m=+802.919829979" watchObservedRunningTime="2026-04-21 14:38:58.255973168 +0000 UTC m=+802.921993897" Apr 21 14:38:58.288476 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.288433 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpls2\" (UniqueName: \"kubernetes.io/projected/783ea719-adc3-43a2-a428-c205db1d5186-kube-api-access-kpls2\") pod \"kuadrant-operator-controller-manager-55c7f4c975-vr7zh\" (UID: \"783ea719-adc3-43a2-a428-c205db1d5186\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" Apr 21 14:38:58.288631 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.288490 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/783ea719-adc3-43a2-a428-c205db1d5186-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-vr7zh\" (UID: \"783ea719-adc3-43a2-a428-c205db1d5186\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" Apr 21 14:38:58.389319 ip-10-0-138-110 
kubenswrapper[2576]: I0421 14:38:58.389281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpls2\" (UniqueName: \"kubernetes.io/projected/783ea719-adc3-43a2-a428-c205db1d5186-kube-api-access-kpls2\") pod \"kuadrant-operator-controller-manager-55c7f4c975-vr7zh\" (UID: \"783ea719-adc3-43a2-a428-c205db1d5186\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" Apr 21 14:38:58.389794 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.389333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/783ea719-adc3-43a2-a428-c205db1d5186-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-vr7zh\" (UID: \"783ea719-adc3-43a2-a428-c205db1d5186\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" Apr 21 14:38:58.389794 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.389708 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/783ea719-adc3-43a2-a428-c205db1d5186-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-vr7zh\" (UID: \"783ea719-adc3-43a2-a428-c205db1d5186\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" Apr 21 14:38:58.398254 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.398228 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpls2\" (UniqueName: \"kubernetes.io/projected/783ea719-adc3-43a2-a428-c205db1d5186-kube-api-access-kpls2\") pod \"kuadrant-operator-controller-manager-55c7f4c975-vr7zh\" (UID: \"783ea719-adc3-43a2-a428-c205db1d5186\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" Apr 21 14:38:58.514240 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.514146 2576 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" Apr 21 14:38:58.652306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:58.652278 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh"] Apr 21 14:38:58.654208 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:38:58.654174 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod783ea719_adc3_43a2_a428_c205db1d5186.slice/crio-f4e00dbc73644256f81d8a83cd8c1dd59526fae74a59558a12ed9732ab95b6fe WatchSource:0}: Error finding container f4e00dbc73644256f81d8a83cd8c1dd59526fae74a59558a12ed9732ab95b6fe: Status 404 returned error can't find the container with id f4e00dbc73644256f81d8a83cd8c1dd59526fae74a59558a12ed9732ab95b6fe Apr 21 14:38:59.221461 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:38:59.221427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" event={"ID":"783ea719-adc3-43a2-a428-c205db1d5186","Type":"ContainerStarted","Data":"f4e00dbc73644256f81d8a83cd8c1dd59526fae74a59558a12ed9732ab95b6fe"} Apr 21 14:39:03.240527 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:39:03.240489 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" event={"ID":"783ea719-adc3-43a2-a428-c205db1d5186","Type":"ContainerStarted","Data":"c575d26a28c3a01b5bf88abc579d541046ebbc887be4a542056573d87c2c8984"} Apr 21 14:39:03.240927 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:39:03.240616 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" Apr 21 14:39:03.262230 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:39:03.262163 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" podStartSLOduration=1.48476089 podStartE2EDuration="5.262142678s" podCreationTimestamp="2026-04-21 14:38:58 +0000 UTC" firstStartedPulling="2026-04-21 14:38:58.656506568 +0000 UTC m=+803.322527188" lastFinishedPulling="2026-04-21 14:39:02.433888356 +0000 UTC m=+807.099908976" observedRunningTime="2026-04-21 14:39:03.258957507 +0000 UTC m=+807.924978162" watchObservedRunningTime="2026-04-21 14:39:03.262142678 +0000 UTC m=+807.928163321" Apr 21 14:39:09.223883 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:39:09.223841 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4rx8q" Apr 21 14:39:14.245964 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:39:14.245927 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" Apr 21 14:40:12.467833 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.467797 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj"] Apr 21 14:40:12.470529 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.470505 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" Apr 21 14:40:12.472905 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.472884 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 14:40:12.473685 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.473670 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qj2gg\"" Apr 21 14:40:12.473741 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.473678 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 14:40:12.480615 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.480591 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj"] Apr 21 14:40:12.591080 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.591017 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj\" (UID: \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" Apr 21 14:40:12.591080 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.591085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stlx6\" (UniqueName: \"kubernetes.io/projected/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-kube-api-access-stlx6\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj\" (UID: \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\") " 
pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" Apr 21 14:40:12.591325 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.591139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj\" (UID: \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" Apr 21 14:40:12.692505 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.692473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj\" (UID: \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" Apr 21 14:40:12.692607 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.692529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stlx6\" (UniqueName: \"kubernetes.io/projected/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-kube-api-access-stlx6\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj\" (UID: \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" Apr 21 14:40:12.692607 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.692551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj\" (UID: \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\") " 
pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" Apr 21 14:40:12.692859 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.692841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj\" (UID: \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" Apr 21 14:40:12.692899 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.692875 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj\" (UID: \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" Apr 21 14:40:12.701016 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.700994 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stlx6\" (UniqueName: \"kubernetes.io/projected/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-kube-api-access-stlx6\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj\" (UID: \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" Apr 21 14:40:12.780612 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.780529 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" Apr 21 14:40:12.907235 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:12.907209 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj"] Apr 21 14:40:12.909208 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:40:12.909173 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a5c7ff0_b79f_4d16_9679_a1011928a9bb.slice/crio-6492df55110342686fe2cfd0e66251674adede18ca3af1e6901c578eb8f8bfe8 WatchSource:0}: Error finding container 6492df55110342686fe2cfd0e66251674adede18ca3af1e6901c578eb8f8bfe8: Status 404 returned error can't find the container with id 6492df55110342686fe2cfd0e66251674adede18ca3af1e6901c578eb8f8bfe8 Apr 21 14:40:13.494730 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:13.494698 2576 generic.go:358] "Generic (PLEG): container finished" podID="3a5c7ff0-b79f-4d16-9679-a1011928a9bb" containerID="98c11939dbe11478994206a255e9c81b29144a651d2a379a8835b7d420f386a1" exitCode=0 Apr 21 14:40:13.495091 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:13.494745 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" event={"ID":"3a5c7ff0-b79f-4d16-9679-a1011928a9bb","Type":"ContainerDied","Data":"98c11939dbe11478994206a255e9c81b29144a651d2a379a8835b7d420f386a1"} Apr 21 14:40:13.495091 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:13.494770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" event={"ID":"3a5c7ff0-b79f-4d16-9679-a1011928a9bb","Type":"ContainerStarted","Data":"6492df55110342686fe2cfd0e66251674adede18ca3af1e6901c578eb8f8bfe8"} Apr 21 14:40:14.501456 ip-10-0-138-110 kubenswrapper[2576]: 
I0421 14:40:14.501367 2576 generic.go:358] "Generic (PLEG): container finished" podID="3a5c7ff0-b79f-4d16-9679-a1011928a9bb" containerID="f7f3b3dcb98221e63ebf068137036c1bd1d166a95520c644c3e75c7bbf49c92d" exitCode=0 Apr 21 14:40:14.501456 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:14.501417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" event={"ID":"3a5c7ff0-b79f-4d16-9679-a1011928a9bb","Type":"ContainerDied","Data":"f7f3b3dcb98221e63ebf068137036c1bd1d166a95520c644c3e75c7bbf49c92d"} Apr 21 14:40:15.507550 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:15.507517 2576 generic.go:358] "Generic (PLEG): container finished" podID="3a5c7ff0-b79f-4d16-9679-a1011928a9bb" containerID="7f59e22afae08ff0182992b25df4a98f19a153ad8e449d86e41f759ac7f85270" exitCode=0 Apr 21 14:40:15.507915 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:15.507583 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" event={"ID":"3a5c7ff0-b79f-4d16-9679-a1011928a9bb","Type":"ContainerDied","Data":"7f59e22afae08ff0182992b25df4a98f19a153ad8e449d86e41f759ac7f85270"} Apr 21 14:40:16.642740 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:16.642710 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" Apr 21 14:40:16.828150 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:16.828035 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-util\") pod \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\" (UID: \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\") " Apr 21 14:40:16.828150 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:16.828091 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stlx6\" (UniqueName: \"kubernetes.io/projected/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-kube-api-access-stlx6\") pod \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\" (UID: \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\") " Apr 21 14:40:16.828150 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:16.828136 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-bundle\") pod \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\" (UID: \"3a5c7ff0-b79f-4d16-9679-a1011928a9bb\") " Apr 21 14:40:16.828590 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:16.828555 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-bundle" (OuterVolumeSpecName: "bundle") pod "3a5c7ff0-b79f-4d16-9679-a1011928a9bb" (UID: "3a5c7ff0-b79f-4d16-9679-a1011928a9bb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:40:16.830092 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:16.830068 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-kube-api-access-stlx6" (OuterVolumeSpecName: "kube-api-access-stlx6") pod "3a5c7ff0-b79f-4d16-9679-a1011928a9bb" (UID: "3a5c7ff0-b79f-4d16-9679-a1011928a9bb"). InnerVolumeSpecName "kube-api-access-stlx6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:40:16.834068 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:16.834028 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-util" (OuterVolumeSpecName: "util") pod "3a5c7ff0-b79f-4d16-9679-a1011928a9bb" (UID: "3a5c7ff0-b79f-4d16-9679-a1011928a9bb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:40:16.929396 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:16.929357 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-stlx6\" (UniqueName: \"kubernetes.io/projected/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-kube-api-access-stlx6\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:40:16.929396 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:16.929390 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-bundle\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:40:16.929396 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:16.929402 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a5c7ff0-b79f-4d16-9679-a1011928a9bb-util\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:40:17.516384 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:17.516355 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" Apr 21 14:40:17.516384 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:17.516362 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13502b8sj" event={"ID":"3a5c7ff0-b79f-4d16-9679-a1011928a9bb","Type":"ContainerDied","Data":"6492df55110342686fe2cfd0e66251674adede18ca3af1e6901c578eb8f8bfe8"} Apr 21 14:40:17.516638 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:17.516395 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6492df55110342686fe2cfd0e66251674adede18ca3af1e6901c578eb8f8bfe8" Apr 21 14:40:36.114420 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.114342 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 14:40:36.114768 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.114666 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a5c7ff0-b79f-4d16-9679-a1011928a9bb" containerName="extract" Apr 21 14:40:36.114768 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.114681 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5c7ff0-b79f-4d16-9679-a1011928a9bb" containerName="extract" Apr 21 14:40:36.114768 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.114703 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a5c7ff0-b79f-4d16-9679-a1011928a9bb" containerName="util" Apr 21 14:40:36.114768 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.114709 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5c7ff0-b79f-4d16-9679-a1011928a9bb" containerName="util" Apr 21 14:40:36.114768 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.114716 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a5c7ff0-b79f-4d16-9679-a1011928a9bb" 
containerName="pull" Apr 21 14:40:36.114768 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.114721 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5c7ff0-b79f-4d16-9679-a1011928a9bb" containerName="pull" Apr 21 14:40:36.114950 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.114779 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a5c7ff0-b79f-4d16-9679-a1011928a9bb" containerName="extract" Apr 21 14:40:36.116595 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.116579 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 14:40:36.118812 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.118792 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-z8gx7\"" Apr 21 14:40:36.118929 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.118860 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 21 14:40:36.119629 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.119613 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 21 14:40:36.119699 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.119613 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 21 14:40:36.130164 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.130141 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 14:40:36.184540 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.184497 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n6qv\" (UniqueName: \"kubernetes.io/projected/42e68516-84a1-4bf1-9be6-896b65a099dd-kube-api-access-6n6qv\") pod \"maas-keycloak-0\" (UID: 
\"42e68516-84a1-4bf1-9be6-896b65a099dd\") " pod="keycloak-system/maas-keycloak-0" Apr 21 14:40:36.285563 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.285524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6n6qv\" (UniqueName: \"kubernetes.io/projected/42e68516-84a1-4bf1-9be6-896b65a099dd-kube-api-access-6n6qv\") pod \"maas-keycloak-0\" (UID: \"42e68516-84a1-4bf1-9be6-896b65a099dd\") " pod="keycloak-system/maas-keycloak-0" Apr 21 14:40:36.295239 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.295213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n6qv\" (UniqueName: \"kubernetes.io/projected/42e68516-84a1-4bf1-9be6-896b65a099dd-kube-api-access-6n6qv\") pod \"maas-keycloak-0\" (UID: \"42e68516-84a1-4bf1-9be6-896b65a099dd\") " pod="keycloak-system/maas-keycloak-0" Apr 21 14:40:36.426982 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.426939 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 14:40:36.553176 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.553148 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 14:40:36.554697 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:40:36.554663 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42e68516_84a1_4bf1_9be6_896b65a099dd.slice/crio-eec3ab92a2ed06b3a70739276335ea137617cdb558b58dcd35824891a9a859b8 WatchSource:0}: Error finding container eec3ab92a2ed06b3a70739276335ea137617cdb558b58dcd35824891a9a859b8: Status 404 returned error can't find the container with id eec3ab92a2ed06b3a70739276335ea137617cdb558b58dcd35824891a9a859b8 Apr 21 14:40:36.600840 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:36.600804 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"42e68516-84a1-4bf1-9be6-896b65a099dd","Type":"ContainerStarted","Data":"eec3ab92a2ed06b3a70739276335ea137617cdb558b58dcd35824891a9a859b8"} Apr 21 14:40:41.625475 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:41.625421 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"42e68516-84a1-4bf1-9be6-896b65a099dd","Type":"ContainerStarted","Data":"f308879e45a436f8272d2f3fb3f7a8812ca6645fe6ec65e8483d1d8ea318d55a"} Apr 21 14:40:41.657532 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:41.657472 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.155282021 podStartE2EDuration="5.657455425s" podCreationTimestamp="2026-04-21 14:40:36 +0000 UTC" firstStartedPulling="2026-04-21 14:40:36.556061153 +0000 UTC m=+901.222081773" lastFinishedPulling="2026-04-21 14:40:41.058234543 +0000 UTC m=+905.724255177" observedRunningTime="2026-04-21 14:40:41.655702573 +0000 UTC 
m=+906.321723239" watchObservedRunningTime="2026-04-21 14:40:41.657455425 +0000 UTC m=+906.323476067" Apr 21 14:40:42.427907 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:42.427858 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 21 14:40:42.429752 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:42.429718 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused" Apr 21 14:40:43.427511 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:43.427460 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused" Apr 21 14:40:44.428252 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:44.428199 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused" Apr 21 14:40:45.428182 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:45.428095 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused" Apr 21 14:40:46.427801 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:46.427759 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="keycloak-system/maas-keycloak-0" Apr 21 14:40:46.428387 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:46.428056 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused" Apr 21 14:40:47.428044 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:47.427995 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused" Apr 21 14:40:48.427424 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:48.427368 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused" Apr 21 14:40:49.427771 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:49.427712 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused" Apr 21 14:40:50.427579 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:50.427532 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused" Apr 21 14:40:51.427529 ip-10-0-138-110 kubenswrapper[2576]: 
I0421 14:40:51.427489 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused" Apr 21 14:40:52.427826 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:52.427766 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused" Apr 21 14:40:53.427979 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:53.427927 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused" Apr 21 14:40:54.533913 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:54.533857 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 21 14:40:54.550000 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:40:54.549950 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 14:41:04.540682 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:04.540598 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 21 14:41:14.805481 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:14.805450 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-867cc7c758-bxws2"] Apr 21 14:41:14.816650 
ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:14.816622 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-867cc7c758-bxws2" Apr 21 14:41:14.818921 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:14.818896 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 21 14:41:14.819057 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:14.818974 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-h4b5p\"" Apr 21 14:41:14.819057 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:14.818898 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 21 14:41:14.824506 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:14.824482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mwq7\" (UniqueName: \"kubernetes.io/projected/19decc7a-dddd-49d2-9925-4af913cd6f1e-kube-api-access-8mwq7\") pod \"maas-api-867cc7c758-bxws2\" (UID: \"19decc7a-dddd-49d2-9925-4af913cd6f1e\") " pod="opendatahub/maas-api-867cc7c758-bxws2" Apr 21 14:41:14.825026 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:14.824640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/19decc7a-dddd-49d2-9925-4af913cd6f1e-maas-api-tls\") pod \"maas-api-867cc7c758-bxws2\" (UID: \"19decc7a-dddd-49d2-9925-4af913cd6f1e\") " pod="opendatahub/maas-api-867cc7c758-bxws2" Apr 21 14:41:14.825789 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:14.825770 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-867cc7c758-bxws2"] Apr 21 14:41:14.925410 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:14.925375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8mwq7\" (UniqueName: \"kubernetes.io/projected/19decc7a-dddd-49d2-9925-4af913cd6f1e-kube-api-access-8mwq7\") pod \"maas-api-867cc7c758-bxws2\" (UID: \"19decc7a-dddd-49d2-9925-4af913cd6f1e\") " pod="opendatahub/maas-api-867cc7c758-bxws2" Apr 21 14:41:14.925580 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:14.925438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/19decc7a-dddd-49d2-9925-4af913cd6f1e-maas-api-tls\") pod \"maas-api-867cc7c758-bxws2\" (UID: \"19decc7a-dddd-49d2-9925-4af913cd6f1e\") " pod="opendatahub/maas-api-867cc7c758-bxws2" Apr 21 14:41:14.927893 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:14.927870 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/19decc7a-dddd-49d2-9925-4af913cd6f1e-maas-api-tls\") pod \"maas-api-867cc7c758-bxws2\" (UID: \"19decc7a-dddd-49d2-9925-4af913cd6f1e\") " pod="opendatahub/maas-api-867cc7c758-bxws2" Apr 21 14:41:14.936503 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:14.936476 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mwq7\" (UniqueName: \"kubernetes.io/projected/19decc7a-dddd-49d2-9925-4af913cd6f1e-kube-api-access-8mwq7\") pod \"maas-api-867cc7c758-bxws2\" (UID: \"19decc7a-dddd-49d2-9925-4af913cd6f1e\") " pod="opendatahub/maas-api-867cc7c758-bxws2" Apr 21 14:41:15.128496 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:15.128349 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-867cc7c758-bxws2" Apr 21 14:41:15.261609 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:15.261579 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-867cc7c758-bxws2"] Apr 21 14:41:15.262764 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:41:15.262734 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19decc7a_dddd_49d2_9925_4af913cd6f1e.slice/crio-22e7764b05d6f04b2597fade7941537eb391c1f6d486448e7bba50abd762a2ce WatchSource:0}: Error finding container 22e7764b05d6f04b2597fade7941537eb391c1f6d486448e7bba50abd762a2ce: Status 404 returned error can't find the container with id 22e7764b05d6f04b2597fade7941537eb391c1f6d486448e7bba50abd762a2ce Apr 21 14:41:15.264090 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:15.264069 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 14:41:15.789458 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:15.789414 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-867cc7c758-bxws2" event={"ID":"19decc7a-dddd-49d2-9925-4af913cd6f1e","Type":"ContainerStarted","Data":"22e7764b05d6f04b2597fade7941537eb391c1f6d486448e7bba50abd762a2ce"} Apr 21 14:41:17.798727 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:17.798649 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-867cc7c758-bxws2" event={"ID":"19decc7a-dddd-49d2-9925-4af913cd6f1e","Type":"ContainerStarted","Data":"abff7a71db69b61262dfde75b752c1ed55149353b34b6652f7565bd2da4da38b"} Apr 21 14:41:17.799059 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:17.798741 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-867cc7c758-bxws2" Apr 21 14:41:17.817257 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:17.817206 2576 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="opendatahub/maas-api-867cc7c758-bxws2" podStartSLOduration=1.666043862 podStartE2EDuration="3.81718982s" podCreationTimestamp="2026-04-21 14:41:14 +0000 UTC" firstStartedPulling="2026-04-21 14:41:15.264223202 +0000 UTC m=+939.930243821" lastFinishedPulling="2026-04-21 14:41:17.415369157 +0000 UTC m=+942.081389779" observedRunningTime="2026-04-21 14:41:17.816163062 +0000 UTC m=+942.482183701" watchObservedRunningTime="2026-04-21 14:41:17.81718982 +0000 UTC m=+942.483210461" Apr 21 14:41:23.807633 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:23.807605 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-867cc7c758-bxws2" Apr 21 14:41:39.082275 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:39.082232 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 14:41:39.082817 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:39.082525 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="keycloak-system/maas-keycloak-0" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" containerID="cri-o://f308879e45a436f8272d2f3fb3f7a8812ca6645fe6ec65e8483d1d8ea318d55a" gracePeriod=30 Apr 21 14:41:40.718626 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.718600 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 14:41:40.840041 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.839956 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n6qv\" (UniqueName: \"kubernetes.io/projected/42e68516-84a1-4bf1-9be6-896b65a099dd-kube-api-access-6n6qv\") pod \"42e68516-84a1-4bf1-9be6-896b65a099dd\" (UID: \"42e68516-84a1-4bf1-9be6-896b65a099dd\") " Apr 21 14:41:40.842162 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.842135 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e68516-84a1-4bf1-9be6-896b65a099dd-kube-api-access-6n6qv" (OuterVolumeSpecName: "kube-api-access-6n6qv") pod "42e68516-84a1-4bf1-9be6-896b65a099dd" (UID: "42e68516-84a1-4bf1-9be6-896b65a099dd"). InnerVolumeSpecName "kube-api-access-6n6qv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:41:40.880808 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.880773 2576 generic.go:358] "Generic (PLEG): container finished" podID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerID="f308879e45a436f8272d2f3fb3f7a8812ca6645fe6ec65e8483d1d8ea318d55a" exitCode=143 Apr 21 14:41:40.880971 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.880844 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 14:41:40.880971 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.880845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"42e68516-84a1-4bf1-9be6-896b65a099dd","Type":"ContainerDied","Data":"f308879e45a436f8272d2f3fb3f7a8812ca6645fe6ec65e8483d1d8ea318d55a"} Apr 21 14:41:40.880971 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.880953 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"42e68516-84a1-4bf1-9be6-896b65a099dd","Type":"ContainerDied","Data":"eec3ab92a2ed06b3a70739276335ea137617cdb558b58dcd35824891a9a859b8"} Apr 21 14:41:40.880971 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.880967 2576 scope.go:117] "RemoveContainer" containerID="f308879e45a436f8272d2f3fb3f7a8812ca6645fe6ec65e8483d1d8ea318d55a" Apr 21 14:41:40.890582 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.890563 2576 scope.go:117] "RemoveContainer" containerID="f308879e45a436f8272d2f3fb3f7a8812ca6645fe6ec65e8483d1d8ea318d55a" Apr 21 14:41:40.890849 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:41:40.890831 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f308879e45a436f8272d2f3fb3f7a8812ca6645fe6ec65e8483d1d8ea318d55a\": container with ID starting with f308879e45a436f8272d2f3fb3f7a8812ca6645fe6ec65e8483d1d8ea318d55a not found: ID does not exist" containerID="f308879e45a436f8272d2f3fb3f7a8812ca6645fe6ec65e8483d1d8ea318d55a" Apr 21 14:41:40.890908 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.890861 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f308879e45a436f8272d2f3fb3f7a8812ca6645fe6ec65e8483d1d8ea318d55a"} err="failed to get container status \"f308879e45a436f8272d2f3fb3f7a8812ca6645fe6ec65e8483d1d8ea318d55a\": rpc error: code = NotFound desc = could not find 
container \"f308879e45a436f8272d2f3fb3f7a8812ca6645fe6ec65e8483d1d8ea318d55a\": container with ID starting with f308879e45a436f8272d2f3fb3f7a8812ca6645fe6ec65e8483d1d8ea318d55a not found: ID does not exist" Apr 21 14:41:40.906211 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.906184 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 14:41:40.909621 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.909599 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 14:41:40.940707 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.940683 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6n6qv\" (UniqueName: \"kubernetes.io/projected/42e68516-84a1-4bf1-9be6-896b65a099dd-kube-api-access-6n6qv\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:41:40.964964 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.964935 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 14:41:40.965316 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.965303 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" Apr 21 14:41:40.965361 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.965318 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" Apr 21 14:41:40.965395 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.965365 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" containerName="keycloak" Apr 21 14:41:40.969743 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.969725 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 14:41:40.972725 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.972699 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 21 14:41:40.972949 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.972933 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-z8gx7\"" Apr 21 14:41:40.973003 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.972972 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 21 14:41:40.973201 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.973188 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"keycloak-test-realms\"" Apr 21 14:41:40.974071 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.974057 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 21 14:41:40.991777 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:40.991754 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 14:41:41.041913 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:41.041879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwxgn\" (UniqueName: \"kubernetes.io/projected/abd00f2f-5cc5-4bb3-9cb0-f396ba776a90-kube-api-access-hwxgn\") pod \"maas-keycloak-0\" (UID: \"abd00f2f-5cc5-4bb3-9cb0-f396ba776a90\") " pod="keycloak-system/maas-keycloak-0" Apr 21 14:41:41.042072 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:41.042003 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/abd00f2f-5cc5-4bb3-9cb0-f396ba776a90-test-realms\") 
pod \"maas-keycloak-0\" (UID: \"abd00f2f-5cc5-4bb3-9cb0-f396ba776a90\") " pod="keycloak-system/maas-keycloak-0" Apr 21 14:41:41.143019 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:41.142991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/abd00f2f-5cc5-4bb3-9cb0-f396ba776a90-test-realms\") pod \"maas-keycloak-0\" (UID: \"abd00f2f-5cc5-4bb3-9cb0-f396ba776a90\") " pod="keycloak-system/maas-keycloak-0" Apr 21 14:41:41.143187 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:41.143028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwxgn\" (UniqueName: \"kubernetes.io/projected/abd00f2f-5cc5-4bb3-9cb0-f396ba776a90-kube-api-access-hwxgn\") pod \"maas-keycloak-0\" (UID: \"abd00f2f-5cc5-4bb3-9cb0-f396ba776a90\") " pod="keycloak-system/maas-keycloak-0" Apr 21 14:41:41.143730 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:41.143710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/abd00f2f-5cc5-4bb3-9cb0-f396ba776a90-test-realms\") pod \"maas-keycloak-0\" (UID: \"abd00f2f-5cc5-4bb3-9cb0-f396ba776a90\") " pod="keycloak-system/maas-keycloak-0" Apr 21 14:41:41.152071 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:41.152042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwxgn\" (UniqueName: \"kubernetes.io/projected/abd00f2f-5cc5-4bb3-9cb0-f396ba776a90-kube-api-access-hwxgn\") pod \"maas-keycloak-0\" (UID: \"abd00f2f-5cc5-4bb3-9cb0-f396ba776a90\") " pod="keycloak-system/maas-keycloak-0" Apr 21 14:41:41.279400 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:41.279362 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 14:41:41.407899 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:41.407860 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 14:41:41.408642 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:41:41.408610 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabd00f2f_5cc5_4bb3_9cb0_f396ba776a90.slice/crio-e92ef1c91623990c7fbe67a6c567b1e2dc5d8535d06bbe3bcbf11d0e881c60e3 WatchSource:0}: Error finding container e92ef1c91623990c7fbe67a6c567b1e2dc5d8535d06bbe3bcbf11d0e881c60e3: Status 404 returned error can't find the container with id e92ef1c91623990c7fbe67a6c567b1e2dc5d8535d06bbe3bcbf11d0e881c60e3 Apr 21 14:41:41.801087 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:41.801009 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e68516-84a1-4bf1-9be6-896b65a099dd" path="/var/lib/kubelet/pods/42e68516-84a1-4bf1-9be6-896b65a099dd/volumes" Apr 21 14:41:41.885972 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:41.885931 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"abd00f2f-5cc5-4bb3-9cb0-f396ba776a90","Type":"ContainerStarted","Data":"e92ef1c91623990c7fbe67a6c567b1e2dc5d8535d06bbe3bcbf11d0e881c60e3"} Apr 21 14:41:42.894416 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:42.894317 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"abd00f2f-5cc5-4bb3-9cb0-f396ba776a90","Type":"ContainerStarted","Data":"41e66ddb94b515a1ca604bff01c9773cee0bb85eff3fdd33cb0f7d469250bcbd"} Apr 21 14:41:42.918881 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:42.918814 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=2.280155453 podStartE2EDuration="2.918793167s" 
podCreationTimestamp="2026-04-21 14:41:40 +0000 UTC" firstStartedPulling="2026-04-21 14:41:41.409962705 +0000 UTC m=+966.075983326" lastFinishedPulling="2026-04-21 14:41:42.048600415 +0000 UTC m=+966.714621040" observedRunningTime="2026-04-21 14:41:42.914427121 +0000 UTC m=+967.580447787" watchObservedRunningTime="2026-04-21 14:41:42.918793167 +0000 UTC m=+967.584813809" Apr 21 14:41:43.280038 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:43.279952 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 21 14:41:43.281304 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:43.281272 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="abd00f2f-5cc5-4bb3-9cb0-f396ba776a90" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.39:9000/health/started\": dial tcp 10.133.0.39:9000: connect: connection refused" Apr 21 14:41:44.280769 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:44.280718 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="abd00f2f-5cc5-4bb3-9cb0-f396ba776a90" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.39:9000/health/started\": dial tcp 10.133.0.39:9000: connect: connection refused" Apr 21 14:41:45.280685 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:45.280637 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="abd00f2f-5cc5-4bb3-9cb0-f396ba776a90" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.39:9000/health/started\": dial tcp 10.133.0.39:9000: connect: connection refused" Apr 21 14:41:46.280038 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:46.279993 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="abd00f2f-5cc5-4bb3-9cb0-f396ba776a90" containerName="keycloak" probeResult="failure" output="Get 
\"http://10.133.0.39:9000/health/started\": dial tcp 10.133.0.39:9000: connect: connection refused" Apr 21 14:41:47.280845 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:47.280788 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="abd00f2f-5cc5-4bb3-9cb0-f396ba776a90" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.39:9000/health/started\": dial tcp 10.133.0.39:9000: connect: connection refused" Apr 21 14:41:48.280722 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:48.280669 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="abd00f2f-5cc5-4bb3-9cb0-f396ba776a90" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.39:9000/health/started\": dial tcp 10.133.0.39:9000: connect: connection refused" Apr 21 14:41:49.280735 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:49.280686 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="abd00f2f-5cc5-4bb3-9cb0-f396ba776a90" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.39:9000/health/started\": dial tcp 10.133.0.39:9000: connect: connection refused" Apr 21 14:41:50.279971 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:50.279917 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="abd00f2f-5cc5-4bb3-9cb0-f396ba776a90" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.39:9000/health/started\": dial tcp 10.133.0.39:9000: connect: connection refused" Apr 21 14:41:51.279622 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:51.279582 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 21 14:41:51.280136 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:51.279862 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" 
podUID="abd00f2f-5cc5-4bb3-9cb0-f396ba776a90" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.39:9000/health/started\": dial tcp 10.133.0.39:9000: connect: connection refused" Apr 21 14:41:52.280334 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.280231 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="abd00f2f-5cc5-4bb3-9cb0-f396ba776a90" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.39:9000/health/started\": dial tcp 10.133.0.39:9000: connect: connection refused" Apr 21 14:41:52.371737 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.371689 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-867cc7c758-bxws2"] Apr 21 14:41:52.372039 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.372007 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-867cc7c758-bxws2" podUID="19decc7a-dddd-49d2-9925-4af913cd6f1e" containerName="maas-api" containerID="cri-o://abff7a71db69b61262dfde75b752c1ed55149353b34b6652f7565bd2da4da38b" gracePeriod=30 Apr 21 14:41:52.642548 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.642513 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-867cc7c758-bxws2" Apr 21 14:41:52.759748 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.759698 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/19decc7a-dddd-49d2-9925-4af913cd6f1e-maas-api-tls\") pod \"19decc7a-dddd-49d2-9925-4af913cd6f1e\" (UID: \"19decc7a-dddd-49d2-9925-4af913cd6f1e\") " Apr 21 14:41:52.759920 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.759792 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mwq7\" (UniqueName: \"kubernetes.io/projected/19decc7a-dddd-49d2-9925-4af913cd6f1e-kube-api-access-8mwq7\") pod \"19decc7a-dddd-49d2-9925-4af913cd6f1e\" (UID: \"19decc7a-dddd-49d2-9925-4af913cd6f1e\") " Apr 21 14:41:52.762527 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.762476 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19decc7a-dddd-49d2-9925-4af913cd6f1e-kube-api-access-8mwq7" (OuterVolumeSpecName: "kube-api-access-8mwq7") pod "19decc7a-dddd-49d2-9925-4af913cd6f1e" (UID: "19decc7a-dddd-49d2-9925-4af913cd6f1e"). InnerVolumeSpecName "kube-api-access-8mwq7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:41:52.762640 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.762555 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19decc7a-dddd-49d2-9925-4af913cd6f1e-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "19decc7a-dddd-49d2-9925-4af913cd6f1e" (UID: "19decc7a-dddd-49d2-9925-4af913cd6f1e"). InnerVolumeSpecName "maas-api-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:41:52.860519 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.860418 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8mwq7\" (UniqueName: \"kubernetes.io/projected/19decc7a-dddd-49d2-9925-4af913cd6f1e-kube-api-access-8mwq7\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:41:52.860519 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.860461 2576 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/19decc7a-dddd-49d2-9925-4af913cd6f1e-maas-api-tls\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:41:52.944829 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.944761 2576 generic.go:358] "Generic (PLEG): container finished" podID="19decc7a-dddd-49d2-9925-4af913cd6f1e" containerID="abff7a71db69b61262dfde75b752c1ed55149353b34b6652f7565bd2da4da38b" exitCode=0 Apr 21 14:41:52.944829 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.944814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-867cc7c758-bxws2" event={"ID":"19decc7a-dddd-49d2-9925-4af913cd6f1e","Type":"ContainerDied","Data":"abff7a71db69b61262dfde75b752c1ed55149353b34b6652f7565bd2da4da38b"} Apr 21 14:41:52.945079 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.944847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-867cc7c758-bxws2" event={"ID":"19decc7a-dddd-49d2-9925-4af913cd6f1e","Type":"ContainerDied","Data":"22e7764b05d6f04b2597fade7941537eb391c1f6d486448e7bba50abd762a2ce"} Apr 21 14:41:52.945079 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.944866 2576 scope.go:117] "RemoveContainer" containerID="abff7a71db69b61262dfde75b752c1ed55149353b34b6652f7565bd2da4da38b" Apr 21 14:41:52.945079 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.944868 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-867cc7c758-bxws2" Apr 21 14:41:52.957901 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.957881 2576 scope.go:117] "RemoveContainer" containerID="abff7a71db69b61262dfde75b752c1ed55149353b34b6652f7565bd2da4da38b" Apr 21 14:41:52.958408 ip-10-0-138-110 kubenswrapper[2576]: E0421 14:41:52.958368 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abff7a71db69b61262dfde75b752c1ed55149353b34b6652f7565bd2da4da38b\": container with ID starting with abff7a71db69b61262dfde75b752c1ed55149353b34b6652f7565bd2da4da38b not found: ID does not exist" containerID="abff7a71db69b61262dfde75b752c1ed55149353b34b6652f7565bd2da4da38b" Apr 21 14:41:52.958528 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.958419 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abff7a71db69b61262dfde75b752c1ed55149353b34b6652f7565bd2da4da38b"} err="failed to get container status \"abff7a71db69b61262dfde75b752c1ed55149353b34b6652f7565bd2da4da38b\": rpc error: code = NotFound desc = could not find container \"abff7a71db69b61262dfde75b752c1ed55149353b34b6652f7565bd2da4da38b\": container with ID starting with abff7a71db69b61262dfde75b752c1ed55149353b34b6652f7565bd2da4da38b not found: ID does not exist" Apr 21 14:41:52.973870 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.973824 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-867cc7c758-bxws2"] Apr 21 14:41:52.980079 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:52.980036 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-867cc7c758-bxws2"] Apr 21 14:41:53.280334 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:53.280284 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="abd00f2f-5cc5-4bb3-9cb0-f396ba776a90" containerName="keycloak" probeResult="failure" 
output="Get \"http://10.133.0.39:9000/health/started\": dial tcp 10.133.0.39:9000: connect: connection refused" Apr 21 14:41:53.798418 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:53.798380 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19decc7a-dddd-49d2-9925-4af913cd6f1e" path="/var/lib/kubelet/pods/19decc7a-dddd-49d2-9925-4af913cd6f1e/volumes" Apr 21 14:41:54.279892 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:54.279836 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="abd00f2f-5cc5-4bb3-9cb0-f396ba776a90" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.39:9000/health/started\": dial tcp 10.133.0.39:9000: connect: connection refused" Apr 21 14:41:55.280594 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:55.280544 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="abd00f2f-5cc5-4bb3-9cb0-f396ba776a90" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.39:9000/health/started\": dial tcp 10.133.0.39:9000: connect: connection refused" Apr 21 14:41:56.414552 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:56.414502 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 21 14:41:56.434529 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:41:56.434480 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="abd00f2f-5cc5-4bb3-9cb0-f396ba776a90" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 14:42:06.420793 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:06.420758 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 21 14:42:26.784596 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.784555 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2"] Apr 21 14:42:26.785146 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.785014 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19decc7a-dddd-49d2-9925-4af913cd6f1e" containerName="maas-api" Apr 21 14:42:26.785146 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.785034 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="19decc7a-dddd-49d2-9925-4af913cd6f1e" containerName="maas-api" Apr 21 14:42:26.785146 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.785133 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="19decc7a-dddd-49d2-9925-4af913cd6f1e" containerName="maas-api" Apr 21 14:42:26.788390 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.788371 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.793157 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.793130 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 21 14:42:26.793157 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.793138 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 21 14:42:26.793328 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.793181 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-m4nxz\"" Apr 21 14:42:26.793328 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.793224 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 21 14:42:26.802222 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.802200 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2"] Apr 21 
14:42:26.834315 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.834277 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl64t\" (UniqueName: \"kubernetes.io/projected/d9380417-dacc-4995-bf9b-73200eba0fa9-kube-api-access-tl64t\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.834498 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.834324 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9380417-dacc-4995-bf9b-73200eba0fa9-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.834498 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.834371 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9380417-dacc-4995-bf9b-73200eba0fa9-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.834498 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.834406 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9380417-dacc-4995-bf9b-73200eba0fa9-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.834498 ip-10-0-138-110 kubenswrapper[2576]: 
I0421 14:42:26.834468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9380417-dacc-4995-bf9b-73200eba0fa9-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.834707 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.834512 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9380417-dacc-4995-bf9b-73200eba0fa9-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.935974 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.935935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9380417-dacc-4995-bf9b-73200eba0fa9-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.936204 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.936003 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9380417-dacc-4995-bf9b-73200eba0fa9-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.936204 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.936042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d9380417-dacc-4995-bf9b-73200eba0fa9-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.936204 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.936089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tl64t\" (UniqueName: \"kubernetes.io/projected/d9380417-dacc-4995-bf9b-73200eba0fa9-kube-api-access-tl64t\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.936204 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.936137 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9380417-dacc-4995-bf9b-73200eba0fa9-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.936204 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.936175 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9380417-dacc-4995-bf9b-73200eba0fa9-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.936562 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.936533 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9380417-dacc-4995-bf9b-73200eba0fa9-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" 
(UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.936626 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.936580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9380417-dacc-4995-bf9b-73200eba0fa9-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.936880 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.936861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9380417-dacc-4995-bf9b-73200eba0fa9-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.938968 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.938938 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9380417-dacc-4995-bf9b-73200eba0fa9-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.939149 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.939125 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9380417-dacc-4995-bf9b-73200eba0fa9-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:26.944966 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:26.944943 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tl64t\" (UniqueName: \"kubernetes.io/projected/d9380417-dacc-4995-bf9b-73200eba0fa9-kube-api-access-tl64t\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2\" (UID: \"d9380417-dacc-4995-bf9b-73200eba0fa9\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:27.098188 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:27.098074 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:27.238860 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:27.238826 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2"] Apr 21 14:42:27.240904 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:42:27.240881 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9380417_dacc_4995_bf9b_73200eba0fa9.slice/crio-30fc93643bf0f910f6072544a8d7693944c708575b0d63359610fde006f74c20 WatchSource:0}: Error finding container 30fc93643bf0f910f6072544a8d7693944c708575b0d63359610fde006f74c20: Status 404 returned error can't find the container with id 30fc93643bf0f910f6072544a8d7693944c708575b0d63359610fde006f74c20 Apr 21 14:42:28.087592 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:28.087545 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" event={"ID":"d9380417-dacc-4995-bf9b-73200eba0fa9","Type":"ContainerStarted","Data":"30fc93643bf0f910f6072544a8d7693944c708575b0d63359610fde006f74c20"} Apr 21 14:42:35.118344 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:35.118303 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" 
event={"ID":"d9380417-dacc-4995-bf9b-73200eba0fa9","Type":"ContainerStarted","Data":"bacb0db3eceb2fa3bc98e696665e4325fb1fdb6905b0f96cc352073f5a882851"} Apr 21 14:42:43.150754 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:43.150721 2576 generic.go:358] "Generic (PLEG): container finished" podID="d9380417-dacc-4995-bf9b-73200eba0fa9" containerID="bacb0db3eceb2fa3bc98e696665e4325fb1fdb6905b0f96cc352073f5a882851" exitCode=0 Apr 21 14:42:43.151162 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:43.150800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" event={"ID":"d9380417-dacc-4995-bf9b-73200eba0fa9","Type":"ContainerDied","Data":"bacb0db3eceb2fa3bc98e696665e4325fb1fdb6905b0f96cc352073f5a882851"} Apr 21 14:42:44.577355 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.577253 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq"] Apr 21 14:42:44.579913 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.579890 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.582838 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.582815 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 21 14:42:44.594849 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.594817 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq"] Apr 21 14:42:44.694362 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.694323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e04297e8-e515-47c2-ac1b-cc65d1ac637c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.694558 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.694368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e04297e8-e515-47c2-ac1b-cc65d1ac637c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.694558 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.694482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e04297e8-e515-47c2-ac1b-cc65d1ac637c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 
14:42:44.694558 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.694522 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7vp9\" (UniqueName: \"kubernetes.io/projected/e04297e8-e515-47c2-ac1b-cc65d1ac637c-kube-api-access-t7vp9\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.694714 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.694600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e04297e8-e515-47c2-ac1b-cc65d1ac637c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.694714 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.694648 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e04297e8-e515-47c2-ac1b-cc65d1ac637c-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.795418 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.795374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e04297e8-e515-47c2-ac1b-cc65d1ac637c-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.795580 ip-10-0-138-110 kubenswrapper[2576]: I0421 
14:42:44.795459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e04297e8-e515-47c2-ac1b-cc65d1ac637c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.795580 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.795489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e04297e8-e515-47c2-ac1b-cc65d1ac637c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.795580 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.795566 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e04297e8-e515-47c2-ac1b-cc65d1ac637c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.795721 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.795601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7vp9\" (UniqueName: \"kubernetes.io/projected/e04297e8-e515-47c2-ac1b-cc65d1ac637c-kube-api-access-t7vp9\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.795721 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.795649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/e04297e8-e515-47c2-ac1b-cc65d1ac637c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.795958 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.795935 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e04297e8-e515-47c2-ac1b-cc65d1ac637c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.796043 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.795988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e04297e8-e515-47c2-ac1b-cc65d1ac637c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.796043 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.796019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e04297e8-e515-47c2-ac1b-cc65d1ac637c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.797920 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.797883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e04297e8-e515-47c2-ac1b-cc65d1ac637c-dshm\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.798181 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.798163 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e04297e8-e515-47c2-ac1b-cc65d1ac637c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.806699 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.806676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7vp9\" (UniqueName: \"kubernetes.io/projected/e04297e8-e515-47c2-ac1b-cc65d1ac637c-kube-api-access-t7vp9\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq\" (UID: \"e04297e8-e515-47c2-ac1b-cc65d1ac637c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:44.891695 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:44.891667 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:45.024869 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:45.024836 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq"] Apr 21 14:42:45.026199 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:42:45.026171 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode04297e8_e515_47c2_ac1b_cc65d1ac637c.slice/crio-c18030f7922a02ebb437af4a0f0e0039b734d9ebb77b969cc5d927de2671ae7e WatchSource:0}: Error finding container c18030f7922a02ebb437af4a0f0e0039b734d9ebb77b969cc5d927de2671ae7e: Status 404 returned error can't find the container with id c18030f7922a02ebb437af4a0f0e0039b734d9ebb77b969cc5d927de2671ae7e Apr 21 14:42:45.160305 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:45.160211 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" event={"ID":"d9380417-dacc-4995-bf9b-73200eba0fa9","Type":"ContainerStarted","Data":"a8a56ea0caaffb12196a6e0c6a5438002c6455604d65a27b86f4e69afe12f311"} Apr 21 14:42:45.160470 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:45.160444 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:45.161630 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:45.161607 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" event={"ID":"e04297e8-e515-47c2-ac1b-cc65d1ac637c","Type":"ContainerStarted","Data":"37b64cf1c71653e5a5cdd05fa129e9d831621ed4c8c3d55fb4cae5933a0b0595"} Apr 21 14:42:45.161630 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:45.161632 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" event={"ID":"e04297e8-e515-47c2-ac1b-cc65d1ac637c","Type":"ContainerStarted","Data":"c18030f7922a02ebb437af4a0f0e0039b734d9ebb77b969cc5d927de2671ae7e"} Apr 21 14:42:45.181001 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:45.180951 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" podStartSLOduration=2.108774449 podStartE2EDuration="19.180938544s" podCreationTimestamp="2026-04-21 14:42:26 +0000 UTC" firstStartedPulling="2026-04-21 14:42:27.243170009 +0000 UTC m=+1011.909190629" lastFinishedPulling="2026-04-21 14:42:44.315334095 +0000 UTC m=+1028.981354724" observedRunningTime="2026-04-21 14:42:45.178490671 +0000 UTC m=+1029.844511326" watchObservedRunningTime="2026-04-21 14:42:45.180938544 +0000 UTC m=+1029.846959189" Apr 21 14:42:50.886060 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:50.885981 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw"] Apr 21 14:42:50.918042 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:50.918012 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw"] Apr 21 14:42:50.918213 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:50.918155 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:50.920560 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:50.920535 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 21 14:42:51.050606 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.050569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kljf2\" (UniqueName: \"kubernetes.io/projected/5ca78afe-f98a-49be-a956-4849de4a7e2c-kube-api-access-kljf2\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.050795 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.050614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ca78afe-f98a-49be-a956-4849de4a7e2c-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.050795 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.050643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ca78afe-f98a-49be-a956-4849de4a7e2c-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.050795 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.050733 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ca78afe-f98a-49be-a956-4849de4a7e2c-kserve-provision-location\") pod 
\"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.050795 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.050783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ca78afe-f98a-49be-a956-4849de4a7e2c-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.050974 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.050809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ca78afe-f98a-49be-a956-4849de4a7e2c-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.151635 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.151599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kljf2\" (UniqueName: \"kubernetes.io/projected/5ca78afe-f98a-49be-a956-4849de4a7e2c-kube-api-access-kljf2\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.151807 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.151642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ca78afe-f98a-49be-a956-4849de4a7e2c-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.151807 ip-10-0-138-110 kubenswrapper[2576]: I0421 
14:42:51.151660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ca78afe-f98a-49be-a956-4849de4a7e2c-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.151807 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.151703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ca78afe-f98a-49be-a956-4849de4a7e2c-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.151807 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.151735 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ca78afe-f98a-49be-a956-4849de4a7e2c-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.151807 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.151759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ca78afe-f98a-49be-a956-4849de4a7e2c-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.152165 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.152104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ca78afe-f98a-49be-a956-4849de4a7e2c-kserve-provision-location\") pod 
\"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.152165 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.152144 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ca78afe-f98a-49be-a956-4849de4a7e2c-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.152165 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.152154 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ca78afe-f98a-49be-a956-4849de4a7e2c-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.154040 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.154017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ca78afe-f98a-49be-a956-4849de4a7e2c-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.154140 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.154097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ca78afe-f98a-49be-a956-4849de4a7e2c-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.159695 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.159666 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kljf2\" (UniqueName: \"kubernetes.io/projected/5ca78afe-f98a-49be-a956-4849de4a7e2c-kube-api-access-kljf2\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-dktgw\" (UID: \"5ca78afe-f98a-49be-a956-4849de4a7e2c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.185637 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.185607 2576 generic.go:358] "Generic (PLEG): container finished" podID="e04297e8-e515-47c2-ac1b-cc65d1ac637c" containerID="37b64cf1c71653e5a5cdd05fa129e9d831621ed4c8c3d55fb4cae5933a0b0595" exitCode=0 Apr 21 14:42:51.185762 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.185681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" event={"ID":"e04297e8-e515-47c2-ac1b-cc65d1ac637c","Type":"ContainerDied","Data":"37b64cf1c71653e5a5cdd05fa129e9d831621ed4c8c3d55fb4cae5933a0b0595"} Apr 21 14:42:51.229304 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.228569 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:51.365009 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:51.364985 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw"] Apr 21 14:42:51.366674 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:42:51.366648 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ca78afe_f98a_49be_a956_4849de4a7e2c.slice/crio-51d10bdf21cc010a7536f8375cf650968d4a0ac5a0647c59660d1cfb63d40028 WatchSource:0}: Error finding container 51d10bdf21cc010a7536f8375cf650968d4a0ac5a0647c59660d1cfb63d40028: Status 404 returned error can't find the container with id 51d10bdf21cc010a7536f8375cf650968d4a0ac5a0647c59660d1cfb63d40028 Apr 21 14:42:52.191428 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:52.191394 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" event={"ID":"e04297e8-e515-47c2-ac1b-cc65d1ac637c","Type":"ContainerStarted","Data":"9a8ac263bb5e692272785d8aed7b0c4644408f7f681a08270d9eb7375ec42e28"} Apr 21 14:42:52.191839 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:52.191627 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:42:52.192899 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:52.192874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" event={"ID":"5ca78afe-f98a-49be-a956-4849de4a7e2c","Type":"ContainerStarted","Data":"6a7406130acaa1d0f11446140631f20849900de1f93bf3e9182cb25cc15742da"} Apr 21 14:42:52.192899 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:52.192901 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" 
event={"ID":"5ca78afe-f98a-49be-a956-4849de4a7e2c","Type":"ContainerStarted","Data":"51d10bdf21cc010a7536f8375cf650968d4a0ac5a0647c59660d1cfb63d40028"} Apr 21 14:42:52.276368 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:52.276306 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" podStartSLOduration=7.765959919 podStartE2EDuration="8.276286396s" podCreationTimestamp="2026-04-21 14:42:44 +0000 UTC" firstStartedPulling="2026-04-21 14:42:51.186370192 +0000 UTC m=+1035.852390811" lastFinishedPulling="2026-04-21 14:42:51.696696663 +0000 UTC m=+1036.362717288" observedRunningTime="2026-04-21 14:42:52.237979241 +0000 UTC m=+1036.903999884" watchObservedRunningTime="2026-04-21 14:42:52.276286396 +0000 UTC m=+1036.942307038" Apr 21 14:42:56.178409 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:56.178377 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2" Apr 21 14:42:57.212504 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:57.212472 2576 generic.go:358] "Generic (PLEG): container finished" podID="5ca78afe-f98a-49be-a956-4849de4a7e2c" containerID="6a7406130acaa1d0f11446140631f20849900de1f93bf3e9182cb25cc15742da" exitCode=0 Apr 21 14:42:57.212860 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:57.212537 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" event={"ID":"5ca78afe-f98a-49be-a956-4849de4a7e2c","Type":"ContainerDied","Data":"6a7406130acaa1d0f11446140631f20849900de1f93bf3e9182cb25cc15742da"} Apr 21 14:42:58.218008 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:58.217967 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" 
event={"ID":"5ca78afe-f98a-49be-a956-4849de4a7e2c","Type":"ContainerStarted","Data":"0c22a225324543dd86c938f5be8e73c34a2e82917f388779a8e81fef93c8c12a"} Apr 21 14:42:58.218426 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:58.218243 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:42:58.236811 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:42:58.236746 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" podStartSLOduration=7.880244382 podStartE2EDuration="8.236727072s" podCreationTimestamp="2026-04-21 14:42:50 +0000 UTC" firstStartedPulling="2026-04-21 14:42:57.213172514 +0000 UTC m=+1041.879193134" lastFinishedPulling="2026-04-21 14:42:57.569655201 +0000 UTC m=+1042.235675824" observedRunningTime="2026-04-21 14:42:58.235457967 +0000 UTC m=+1042.901478611" watchObservedRunningTime="2026-04-21 14:42:58.236727072 +0000 UTC m=+1042.902747716" Apr 21 14:43:00.685505 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.685472 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k"] Apr 21 14:43:00.751008 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.750968 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k"] Apr 21 14:43:00.751182 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.751158 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.754460 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.754434 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 21 14:43:00.835232 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.835194 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/748dfa61-67e2-47c0-842d-f41dacfcdc75-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.835424 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.835335 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvl9j\" (UniqueName: \"kubernetes.io/projected/748dfa61-67e2-47c0-842d-f41dacfcdc75-kube-api-access-rvl9j\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.835424 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.835384 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/748dfa61-67e2-47c0-842d-f41dacfcdc75-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.835540 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.835432 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/748dfa61-67e2-47c0-842d-f41dacfcdc75-kserve-provision-location\") pod 
\"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.835540 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.835475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/748dfa61-67e2-47c0-842d-f41dacfcdc75-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.835540 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.835535 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/748dfa61-67e2-47c0-842d-f41dacfcdc75-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.936917 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.936827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/748dfa61-67e2-47c0-842d-f41dacfcdc75-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.936917 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.936883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/748dfa61-67e2-47c0-842d-f41dacfcdc75-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.937166 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.936972 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvl9j\" (UniqueName: \"kubernetes.io/projected/748dfa61-67e2-47c0-842d-f41dacfcdc75-kube-api-access-rvl9j\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.937166 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.937004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/748dfa61-67e2-47c0-842d-f41dacfcdc75-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.937166 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.937071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/748dfa61-67e2-47c0-842d-f41dacfcdc75-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.937166 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.937137 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/748dfa61-67e2-47c0-842d-f41dacfcdc75-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.937372 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.937295 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/748dfa61-67e2-47c0-842d-f41dacfcdc75-home\") pod 
\"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.937669 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.937639 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/748dfa61-67e2-47c0-842d-f41dacfcdc75-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.938024 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.938000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/748dfa61-67e2-47c0-842d-f41dacfcdc75-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.940063 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.940039 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/748dfa61-67e2-47c0-842d-f41dacfcdc75-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.940216 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.940200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/748dfa61-67e2-47c0-842d-f41dacfcdc75-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:00.945949 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:00.945918 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rvl9j\" (UniqueName: \"kubernetes.io/projected/748dfa61-67e2-47c0-842d-f41dacfcdc75-kube-api-access-rvl9j\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k\" (UID: \"748dfa61-67e2-47c0-842d-f41dacfcdc75\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:01.061259 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:01.061216 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:01.195784 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:01.195752 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k"] Apr 21 14:43:01.196943 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:43:01.196914 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod748dfa61_67e2_47c0_842d_f41dacfcdc75.slice/crio-3ffa1e40bc83bcfeccb2b8a4366ebeac77f5570ef58e462cfbdced8ff13c2b89 WatchSource:0}: Error finding container 3ffa1e40bc83bcfeccb2b8a4366ebeac77f5570ef58e462cfbdced8ff13c2b89: Status 404 returned error can't find the container with id 3ffa1e40bc83bcfeccb2b8a4366ebeac77f5570ef58e462cfbdced8ff13c2b89 Apr 21 14:43:01.241455 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:01.241427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" event={"ID":"748dfa61-67e2-47c0-842d-f41dacfcdc75","Type":"ContainerStarted","Data":"3ffa1e40bc83bcfeccb2b8a4366ebeac77f5570ef58e462cfbdced8ff13c2b89"} Apr 21 14:43:02.247077 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:02.247040 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" 
event={"ID":"748dfa61-67e2-47c0-842d-f41dacfcdc75","Type":"ContainerStarted","Data":"51dc83fe9d9917fd27b148aff7c25a45f6123b6e5e2ebb5f446e58ef68b494e7"} Apr 21 14:43:03.209602 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:03.209569 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq" Apr 21 14:43:07.270975 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:07.270941 2576 generic.go:358] "Generic (PLEG): container finished" podID="748dfa61-67e2-47c0-842d-f41dacfcdc75" containerID="51dc83fe9d9917fd27b148aff7c25a45f6123b6e5e2ebb5f446e58ef68b494e7" exitCode=0 Apr 21 14:43:07.271413 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:07.271021 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" event={"ID":"748dfa61-67e2-47c0-842d-f41dacfcdc75","Type":"ContainerDied","Data":"51dc83fe9d9917fd27b148aff7c25a45f6123b6e5e2ebb5f446e58ef68b494e7"} Apr 21 14:43:08.276426 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:08.276392 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" event={"ID":"748dfa61-67e2-47c0-842d-f41dacfcdc75","Type":"ContainerStarted","Data":"c7152bb1e911e19d1d55730fc7524c13b7adf30234caf3ed26e38d68b4616bc9"} Apr 21 14:43:08.276801 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:08.276601 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:08.299538 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:08.299490 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" podStartSLOduration=8.041811913 podStartE2EDuration="8.299474337s" podCreationTimestamp="2026-04-21 14:43:00 +0000 UTC" firstStartedPulling="2026-04-21 14:43:07.271682478 +0000 UTC m=+1051.937703098" 
lastFinishedPulling="2026-04-21 14:43:07.529344891 +0000 UTC m=+1052.195365522" observedRunningTime="2026-04-21 14:43:08.296104542 +0000 UTC m=+1052.962125183" watchObservedRunningTime="2026-04-21 14:43:08.299474337 +0000 UTC m=+1052.965494978" Apr 21 14:43:09.239291 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:09.239259 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-dktgw" Apr 21 14:43:18.994897 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:18.994856 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf"] Apr 21 14:43:18.999764 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:18.999741 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.002083 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.002064 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 21 14:43:19.007405 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.007377 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf"] Apr 21 14:43:19.095362 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.095329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7h7t\" (UniqueName: \"kubernetes.io/projected/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-kube-api-access-b7h7t\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.095502 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.095381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.095502 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.095446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.095502 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.095464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.095502 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.095486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.095655 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.095509 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.196291 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.196259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.196470 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.196307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.196470 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.196339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.196470 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.196369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.196653 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.196512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.196716 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.196671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7h7t\" (UniqueName: \"kubernetes.io/projected/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-kube-api-access-b7h7t\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.196799 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.196777 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.196855 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.196808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.196932 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.196913 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 
14:43:19.198586 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.198562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.198872 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.198853 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.205044 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.205020 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7h7t\" (UniqueName: \"kubernetes.io/projected/90d9408e-fe9c-4dc2-96bf-a86e59c997ca-kube-api-access-b7h7t\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf\" (UID: \"90d9408e-fe9c-4dc2-96bf-a86e59c997ca\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.295516 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.295434 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k" Apr 21 14:43:19.311695 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.311664 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:19.661679 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:19.661649 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf"] Apr 21 14:43:19.662858 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:43:19.662833 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90d9408e_fe9c_4dc2_96bf_a86e59c997ca.slice/crio-c239f50979ba457856222e982cc0768b3b0bd0b2beb638dab2f9aedca2608f04 WatchSource:0}: Error finding container c239f50979ba457856222e982cc0768b3b0bd0b2beb638dab2f9aedca2608f04: Status 404 returned error can't find the container with id c239f50979ba457856222e982cc0768b3b0bd0b2beb638dab2f9aedca2608f04 Apr 21 14:43:20.020069 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.019988 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh"] Apr 21 14:43:20.024080 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.024058 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.026884 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.026864 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 21 14:43:20.045163 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.045131 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh"] Apr 21 14:43:20.103029 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.102992 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96a5f987-94e5-4122-b2bb-a47ed2618191-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.103029 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.103032 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96a5f987-94e5-4122-b2bb-a47ed2618191-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.103290 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.103065 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q5jc\" (UniqueName: \"kubernetes.io/projected/96a5f987-94e5-4122-b2bb-a47ed2618191-kube-api-access-9q5jc\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.103290 ip-10-0-138-110 kubenswrapper[2576]: I0421 
14:43:20.103089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96a5f987-94e5-4122-b2bb-a47ed2618191-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.103290 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.103179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96a5f987-94e5-4122-b2bb-a47ed2618191-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.103290 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.103216 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96a5f987-94e5-4122-b2bb-a47ed2618191-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.203769 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.203722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96a5f987-94e5-4122-b2bb-a47ed2618191-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.203950 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.203773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/96a5f987-94e5-4122-b2bb-a47ed2618191-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.203950 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.203913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9q5jc\" (UniqueName: \"kubernetes.io/projected/96a5f987-94e5-4122-b2bb-a47ed2618191-kube-api-access-9q5jc\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.204064 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.203960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96a5f987-94e5-4122-b2bb-a47ed2618191-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.204064 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.204057 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96a5f987-94e5-4122-b2bb-a47ed2618191-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.204208 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.204122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96a5f987-94e5-4122-b2bb-a47ed2618191-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" 
Apr 21 14:43:20.204266 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.204201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96a5f987-94e5-4122-b2bb-a47ed2618191-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.204266 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.204253 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96a5f987-94e5-4122-b2bb-a47ed2618191-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.204395 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.204371 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96a5f987-94e5-4122-b2bb-a47ed2618191-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.206583 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.206559 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96a5f987-94e5-4122-b2bb-a47ed2618191-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.206866 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.206849 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96a5f987-94e5-4122-b2bb-a47ed2618191-tls-certs\") pod 
\"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.215053 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.215028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q5jc\" (UniqueName: \"kubernetes.io/projected/96a5f987-94e5-4122-b2bb-a47ed2618191-kube-api-access-9q5jc\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh\" (UID: \"96a5f987-94e5-4122-b2bb-a47ed2618191\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.324022 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.323934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" event={"ID":"90d9408e-fe9c-4dc2-96bf-a86e59c997ca","Type":"ContainerStarted","Data":"d4ecf5a331d1c4b8c0d2acc7b9ce737c643bf5064bba12e341d2ef64552ac468"} Apr 21 14:43:20.324022 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.323972 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" event={"ID":"90d9408e-fe9c-4dc2-96bf-a86e59c997ca","Type":"ContainerStarted","Data":"c239f50979ba457856222e982cc0768b3b0bd0b2beb638dab2f9aedca2608f04"} Apr 21 14:43:20.337620 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.337582 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:20.492717 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:20.492689 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh"] Apr 21 14:43:20.496007 ip-10-0-138-110 kubenswrapper[2576]: W0421 14:43:20.495981 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a5f987_94e5_4122_b2bb_a47ed2618191.slice/crio-a28a06f436ee4a13d7f46cc2a5bc9ce7fe3f7ac229476bc28a02bc5ff8b4dbcb WatchSource:0}: Error finding container a28a06f436ee4a13d7f46cc2a5bc9ce7fe3f7ac229476bc28a02bc5ff8b4dbcb: Status 404 returned error can't find the container with id a28a06f436ee4a13d7f46cc2a5bc9ce7fe3f7ac229476bc28a02bc5ff8b4dbcb Apr 21 14:43:21.330890 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:21.330819 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" event={"ID":"96a5f987-94e5-4122-b2bb-a47ed2618191","Type":"ContainerStarted","Data":"b0609808cf4a83411e8ee81da63d408d65c2b9a207bb13f3e63f17c6f929735f"} Apr 21 14:43:21.330890 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:21.330860 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" event={"ID":"96a5f987-94e5-4122-b2bb-a47ed2618191","Type":"ContainerStarted","Data":"a28a06f436ee4a13d7f46cc2a5bc9ce7fe3f7ac229476bc28a02bc5ff8b4dbcb"} Apr 21 14:43:26.356887 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:26.356853 2576 generic.go:358] "Generic (PLEG): container finished" podID="90d9408e-fe9c-4dc2-96bf-a86e59c997ca" containerID="d4ecf5a331d1c4b8c0d2acc7b9ce737c643bf5064bba12e341d2ef64552ac468" exitCode=0 Apr 21 14:43:26.357342 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:26.356936 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" event={"ID":"90d9408e-fe9c-4dc2-96bf-a86e59c997ca","Type":"ContainerDied","Data":"d4ecf5a331d1c4b8c0d2acc7b9ce737c643bf5064bba12e341d2ef64552ac468"} Apr 21 14:43:26.358350 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:26.358221 2576 generic.go:358] "Generic (PLEG): container finished" podID="96a5f987-94e5-4122-b2bb-a47ed2618191" containerID="b0609808cf4a83411e8ee81da63d408d65c2b9a207bb13f3e63f17c6f929735f" exitCode=0 Apr 21 14:43:26.358350 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:26.358272 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" event={"ID":"96a5f987-94e5-4122-b2bb-a47ed2618191","Type":"ContainerDied","Data":"b0609808cf4a83411e8ee81da63d408d65c2b9a207bb13f3e63f17c6f929735f"} Apr 21 14:43:27.363756 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:27.363710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" event={"ID":"90d9408e-fe9c-4dc2-96bf-a86e59c997ca","Type":"ContainerStarted","Data":"b9bd8e676fa51d3ed5d6cc9c6f9590e4a4d2a74b277352a08a3555c0132f8e6d"} Apr 21 14:43:27.364306 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:27.363977 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:43:27.365555 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:27.365533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" event={"ID":"96a5f987-94e5-4122-b2bb-a47ed2618191","Type":"ContainerStarted","Data":"1a5a584d2d58598130e127573c492ea92c3fd470f148f8e40fa342365dcffbe5"} Apr 21 14:43:27.365728 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:27.365714 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:27.387560 
ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:27.387510 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" podStartSLOduration=9.172972142 podStartE2EDuration="9.387494117s" podCreationTimestamp="2026-04-21 14:43:18 +0000 UTC" firstStartedPulling="2026-04-21 14:43:26.357773685 +0000 UTC m=+1071.023794305" lastFinishedPulling="2026-04-21 14:43:26.572295648 +0000 UTC m=+1071.238316280" observedRunningTime="2026-04-21 14:43:27.384270457 +0000 UTC m=+1072.050291106" watchObservedRunningTime="2026-04-21 14:43:27.387494117 +0000 UTC m=+1072.053514760" Apr 21 14:43:27.409929 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:27.409877 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" podStartSLOduration=8.175233333 podStartE2EDuration="8.409862898s" podCreationTimestamp="2026-04-21 14:43:19 +0000 UTC" firstStartedPulling="2026-04-21 14:43:26.35882873 +0000 UTC m=+1071.024849350" lastFinishedPulling="2026-04-21 14:43:26.593458294 +0000 UTC m=+1071.259478915" observedRunningTime="2026-04-21 14:43:27.406524526 +0000 UTC m=+1072.072545167" watchObservedRunningTime="2026-04-21 14:43:27.409862898 +0000 UTC m=+1072.075883539" Apr 21 14:43:38.384535 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:38.384500 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh" Apr 21 14:43:38.384964 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:43:38.384949 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf" Apr 21 14:55:47.023225 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:47.023189 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh"] Apr 21 14:55:47.023671 ip-10-0-138-110 kubenswrapper[2576]: 
I0421 14:55:47.023412 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" podUID="783ea719-adc3-43a2-a428-c205db1d5186" containerName="manager" containerID="cri-o://c575d26a28c3a01b5bf88abc579d541046ebbc887be4a542056573d87c2c8984" gracePeriod=10 Apr 21 14:55:47.341941 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:47.341907 2576 generic.go:358] "Generic (PLEG): container finished" podID="783ea719-adc3-43a2-a428-c205db1d5186" containerID="c575d26a28c3a01b5bf88abc579d541046ebbc887be4a542056573d87c2c8984" exitCode=0 Apr 21 14:55:47.342102 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:47.341942 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" event={"ID":"783ea719-adc3-43a2-a428-c205db1d5186","Type":"ContainerDied","Data":"c575d26a28c3a01b5bf88abc579d541046ebbc887be4a542056573d87c2c8984"} Apr 21 14:55:47.380566 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:47.380542 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" Apr 21 14:55:47.433612 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:47.432301 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/783ea719-adc3-43a2-a428-c205db1d5186-extensions-socket-volume\") pod \"783ea719-adc3-43a2-a428-c205db1d5186\" (UID: \"783ea719-adc3-43a2-a428-c205db1d5186\") " Apr 21 14:55:47.433612 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:47.432378 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpls2\" (UniqueName: \"kubernetes.io/projected/783ea719-adc3-43a2-a428-c205db1d5186-kube-api-access-kpls2\") pod \"783ea719-adc3-43a2-a428-c205db1d5186\" (UID: \"783ea719-adc3-43a2-a428-c205db1d5186\") " Apr 21 14:55:47.434137 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:47.434056 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/783ea719-adc3-43a2-a428-c205db1d5186-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "783ea719-adc3-43a2-a428-c205db1d5186" (UID: "783ea719-adc3-43a2-a428-c205db1d5186"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:55:47.435716 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:47.435659 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783ea719-adc3-43a2-a428-c205db1d5186-kube-api-access-kpls2" (OuterVolumeSpecName: "kube-api-access-kpls2") pod "783ea719-adc3-43a2-a428-c205db1d5186" (UID: "783ea719-adc3-43a2-a428-c205db1d5186"). InnerVolumeSpecName "kube-api-access-kpls2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:55:47.533654 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:47.533615 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/783ea719-adc3-43a2-a428-c205db1d5186-extensions-socket-volume\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:55:47.533654 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:47.533646 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kpls2\" (UniqueName: \"kubernetes.io/projected/783ea719-adc3-43a2-a428-c205db1d5186-kube-api-access-kpls2\") on node \"ip-10-0-138-110.ec2.internal\" DevicePath \"\"" Apr 21 14:55:48.347019 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:48.346980 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" event={"ID":"783ea719-adc3-43a2-a428-c205db1d5186","Type":"ContainerDied","Data":"f4e00dbc73644256f81d8a83cd8c1dd59526fae74a59558a12ed9732ab95b6fe"} Apr 21 14:55:48.347521 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:48.347029 2576 scope.go:117] "RemoveContainer" containerID="c575d26a28c3a01b5bf88abc579d541046ebbc887be4a542056573d87c2c8984" Apr 21 14:55:48.347521 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:48.347037 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh" Apr 21 14:55:48.366562 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:48.366536 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh"] Apr 21 14:55:48.370377 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:48.370351 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vr7zh"] Apr 21 14:55:49.796862 ip-10-0-138-110 kubenswrapper[2576]: I0421 14:55:49.796825 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783ea719-adc3-43a2-a428-c205db1d5186" path="/var/lib/kubelet/pods/783ea719-adc3-43a2-a428-c205db1d5186/volumes" Apr 21 15:10:24.509765 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:24.509678 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-ckrj2_bc4e480a-1a65-41e7-85b5-31e172a4dbca/manager/0.log" Apr 21 15:10:24.849309 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:24.849231 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-c8mtf_9307725b-46ef-4197-880a-edc49567bde2/manager/2.log" Apr 21 15:10:25.242476 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:25.242446 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7df645bd74-qg9dd_d06ff8e8-a678-4df5-a145-0b0b2b8a94f2/manager/0.log" Apr 21 15:10:26.094862 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:26.094823 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc_7f9a4b74-e442-4289-be96-61458dae1349/util/0.log" Apr 21 15:10:26.100880 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:26.100857 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc_7f9a4b74-e442-4289-be96-61458dae1349/pull/0.log" Apr 21 15:10:26.106899 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:26.106881 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc_7f9a4b74-e442-4289-be96-61458dae1349/extract/0.log" Apr 21 15:10:26.215150 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:26.215103 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml_a08111d7-e466-49e6-82d9-594c43fba03b/util/0.log" Apr 21 15:10:26.224247 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:26.224226 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml_a08111d7-e466-49e6-82d9-594c43fba03b/pull/0.log" Apr 21 15:10:26.230394 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:26.230372 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml_a08111d7-e466-49e6-82d9-594c43fba03b/extract/0.log" Apr 21 15:10:26.338848 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:26.338812 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8_825448d9-2c0b-4843-9253-284da6d74283/util/0.log" Apr 21 15:10:26.344316 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:26.344290 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8_825448d9-2c0b-4843-9253-284da6d74283/pull/0.log" Apr 21 15:10:26.349973 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:26.349914 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8_825448d9-2c0b-4843-9253-284da6d74283/extract/0.log" Apr 21 15:10:26.455689 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:26.455661 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx_271e3623-0df8-44e8-877d-a74357dfa3cc/pull/0.log" Apr 21 15:10:26.461973 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:26.461954 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx_271e3623-0df8-44e8-877d-a74357dfa3cc/extract/0.log" Apr 21 15:10:26.469649 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:26.469625 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx_271e3623-0df8-44e8-877d-a74357dfa3cc/util/0.log" Apr 21 15:10:27.371575 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:27.371541 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-4rx8q_06d6177d-f05c-456f-a1ea-0990ce113035/manager/0.log" Apr 21 15:10:27.836329 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:27.836303 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-2c7jl_a2f05579-a456-46bb-8cd7-e82c3c0a04c2/discovery/0.log" Apr 21 15:10:27.944382 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:27.944349 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b77d4bb4f-5d8ls_66bb68fe-aa1c-46cd-b3cb-0afd155918f5/kube-auth-proxy/0.log" Apr 21 15:10:28.603271 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:28.603241 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k_748dfa61-67e2-47c0-842d-f41dacfcdc75/main/0.log" Apr 21 15:10:28.609949 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:28.609923 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-qcr9k_748dfa61-67e2-47c0-842d-f41dacfcdc75/storage-initializer/0.log" Apr 21 15:10:28.712606 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:28.712574 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf_90d9408e-fe9c-4dc2-96bf-a86e59c997ca/storage-initializer/0.log" Apr 21 15:10:28.719625 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:28.719602 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-drzbf_90d9408e-fe9c-4dc2-96bf-a86e59c997ca/main/0.log" Apr 21 15:10:28.833615 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:28.833591 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-dktgw_5ca78afe-f98a-49be-a956-4849de4a7e2c/storage-initializer/0.log" Apr 21 15:10:28.840438 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:28.840416 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-dktgw_5ca78afe-f98a-49be-a956-4849de4a7e2c/main/0.log" Apr 21 15:10:28.948644 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:28.948618 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq_e04297e8-e515-47c2-ac1b-cc65d1ac637c/main/0.log" Apr 21 15:10:28.955567 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:28.955546 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccfcwjq_e04297e8-e515-47c2-ac1b-cc65d1ac637c/storage-initializer/0.log" Apr 21 15:10:29.060855 ip-10-0-138-110 
kubenswrapper[2576]: I0421 15:10:29.060825 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh_96a5f987-94e5-4122-b2bb-a47ed2618191/storage-initializer/0.log" Apr 21 15:10:29.069464 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:29.069439 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-cv7xh_96a5f987-94e5-4122-b2bb-a47ed2618191/main/0.log" Apr 21 15:10:29.181494 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:29.181469 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2_d9380417-dacc-4995-bf9b-73200eba0fa9/storage-initializer/0.log" Apr 21 15:10:29.189495 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:29.189471 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-5h2n2_d9380417-dacc-4995-bf9b-73200eba0fa9/main/0.log" Apr 21 15:10:36.497598 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:36.497567 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-s2hcn_14f6c67f-3406-49c7-bc12-9e14ef76f972/global-pull-secret-syncer/0.log" Apr 21 15:10:36.609000 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:36.608962 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4b46l_77bed861-9016-43ee-825c-b20388894201/konnectivity-agent/0.log" Apr 21 15:10:36.702852 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:36.702825 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-110.ec2.internal_1c841ec6859af6aa2f42f3eb0143c101/haproxy/0.log" Apr 21 15:10:40.919361 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:40.919329 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc_7f9a4b74-e442-4289-be96-61458dae1349/extract/0.log" Apr 21 15:10:40.956056 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:40.956028 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc_7f9a4b74-e442-4289-be96-61458dae1349/util/0.log" Apr 21 15:10:40.988086 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:40.988059 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zjzfc_7f9a4b74-e442-4289-be96-61458dae1349/pull/0.log" Apr 21 15:10:41.042088 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:41.042062 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml_a08111d7-e466-49e6-82d9-594c43fba03b/extract/0.log" Apr 21 15:10:41.071122 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:41.071078 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml_a08111d7-e466-49e6-82d9-594c43fba03b/util/0.log" Apr 21 15:10:41.111730 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:41.111703 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kzpml_a08111d7-e466-49e6-82d9-594c43fba03b/pull/0.log" Apr 21 15:10:41.153305 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:41.153280 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8_825448d9-2c0b-4843-9253-284da6d74283/extract/0.log" Apr 21 15:10:41.196257 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:41.196174 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8_825448d9-2c0b-4843-9253-284da6d74283/util/0.log" Apr 21 15:10:41.225090 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:41.225058 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tzgv8_825448d9-2c0b-4843-9253-284da6d74283/pull/0.log" Apr 21 15:10:41.260834 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:41.260800 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx_271e3623-0df8-44e8-877d-a74357dfa3cc/extract/0.log" Apr 21 15:10:41.292913 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:41.292887 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx_271e3623-0df8-44e8-877d-a74357dfa3cc/util/0.log" Apr 21 15:10:41.320474 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:41.320445 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12f8gx_271e3623-0df8-44e8-877d-a74357dfa3cc/pull/0.log" Apr 21 15:10:41.919603 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:41.919567 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-4rx8q_06d6177d-f05c-456f-a1ea-0990ce113035/manager/0.log" Apr 21 15:10:43.780743 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:43.780708 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-gcqk5_88e9e88c-e0de-4c90-9d5f-510016098205/kube-state-metrics/0.log" Apr 21 15:10:43.809598 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:43.809571 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-gcqk5_88e9e88c-e0de-4c90-9d5f-510016098205/kube-rbac-proxy-main/0.log" Apr 21 15:10:43.836729 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:43.836702 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-gcqk5_88e9e88c-e0de-4c90-9d5f-510016098205/kube-rbac-proxy-self/0.log" Apr 21 15:10:44.104376 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:44.104298 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cg4jg_6d7bd865-ba9b-41eb-88f9-f4701a3c1880/node-exporter/0.log" Apr 21 15:10:44.132407 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:44.132379 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cg4jg_6d7bd865-ba9b-41eb-88f9-f4701a3c1880/kube-rbac-proxy/0.log" Apr 21 15:10:44.181838 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:44.181811 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cg4jg_6d7bd865-ba9b-41eb-88f9-f4701a3c1880/init-textfile/0.log" Apr 21 15:10:44.379000 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:44.378919 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zzz96_c88915e8-596e-4a44-94d4-04a3d507a949/kube-rbac-proxy-main/0.log" Apr 21 15:10:44.406392 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:44.406362 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zzz96_c88915e8-596e-4a44-94d4-04a3d507a949/kube-rbac-proxy-self/0.log" Apr 21 15:10:44.439084 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:44.439057 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zzz96_c88915e8-596e-4a44-94d4-04a3d507a949/openshift-state-metrics/0.log" Apr 21 15:10:45.394532 
ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.394491 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4"] Apr 21 15:10:45.394969 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.394948 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="783ea719-adc3-43a2-a428-c205db1d5186" containerName="manager" Apr 21 15:10:45.394969 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.394965 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="783ea719-adc3-43a2-a428-c205db1d5186" containerName="manager" Apr 21 15:10:45.395057 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.395044 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="783ea719-adc3-43a2-a428-c205db1d5186" containerName="manager" Apr 21 15:10:45.398268 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.398247 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.400401 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.400379 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qvxwt\"/\"openshift-service-ca.crt\"" Apr 21 15:10:45.401013 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.400990 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qvxwt\"/\"kube-root-ca.crt\"" Apr 21 15:10:45.401289 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.401273 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qvxwt\"/\"default-dockercfg-v77v4\"" Apr 21 15:10:45.408386 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.408362 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4"] Apr 21 15:10:45.515329 ip-10-0-138-110 kubenswrapper[2576]: I0421 
15:10:45.515273 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/98a42440-895f-45e1-806a-2dcf34f85a98-podres\") pod \"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.515533 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.515405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/98a42440-895f-45e1-806a-2dcf34f85a98-proc\") pod \"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.515533 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.515429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w66zs\" (UniqueName: \"kubernetes.io/projected/98a42440-895f-45e1-806a-2dcf34f85a98-kube-api-access-w66zs\") pod \"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.515533 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.515465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98a42440-895f-45e1-806a-2dcf34f85a98-sys\") pod \"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.515533 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.515525 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98a42440-895f-45e1-806a-2dcf34f85a98-lib-modules\") pod 
\"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.616430 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.616386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98a42440-895f-45e1-806a-2dcf34f85a98-lib-modules\") pod \"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.616430 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.616435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/98a42440-895f-45e1-806a-2dcf34f85a98-podres\") pod \"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.616655 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.616485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/98a42440-895f-45e1-806a-2dcf34f85a98-proc\") pod \"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.616655 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.616501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w66zs\" (UniqueName: \"kubernetes.io/projected/98a42440-895f-45e1-806a-2dcf34f85a98-kube-api-access-w66zs\") pod \"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.616655 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.616534 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98a42440-895f-45e1-806a-2dcf34f85a98-sys\") pod \"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.616655 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.616578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/98a42440-895f-45e1-806a-2dcf34f85a98-podres\") pod \"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.616655 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.616600 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/98a42440-895f-45e1-806a-2dcf34f85a98-proc\") pod \"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.616655 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.616584 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98a42440-895f-45e1-806a-2dcf34f85a98-lib-modules\") pod \"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.616655 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.616606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98a42440-895f-45e1-806a-2dcf34f85a98-sys\") pod \"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.625999 ip-10-0-138-110 kubenswrapper[2576]: 
I0421 15:10:45.625967 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w66zs\" (UniqueName: \"kubernetes.io/projected/98a42440-895f-45e1-806a-2dcf34f85a98-kube-api-access-w66zs\") pod \"perf-node-gather-daemonset-5qgf4\" (UID: \"98a42440-895f-45e1-806a-2dcf34f85a98\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.709186 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.709062 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" Apr 21 15:10:45.836251 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.836224 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4"] Apr 21 15:10:45.838522 ip-10-0-138-110 kubenswrapper[2576]: W0421 15:10:45.838491 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod98a42440_895f_45e1_806a_2dcf34f85a98.slice/crio-d46105f58e0f69ac20df51c8c1c4846ae1c90d7c469ba8a445f67552615a117f WatchSource:0}: Error finding container d46105f58e0f69ac20df51c8c1c4846ae1c90d7c469ba8a445f67552615a117f: Status 404 returned error can't find the container with id d46105f58e0f69ac20df51c8c1c4846ae1c90d7c469ba8a445f67552615a117f Apr 21 15:10:45.840421 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.840403 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:10:45.883747 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:45.883719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" event={"ID":"98a42440-895f-45e1-806a-2dcf34f85a98","Type":"ContainerStarted","Data":"d46105f58e0f69ac20df51c8c1c4846ae1c90d7c469ba8a445f67552615a117f"} Apr 21 15:10:46.889823 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:46.889785 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" event={"ID":"98a42440-895f-45e1-806a-2dcf34f85a98","Type":"ContainerStarted","Data":"eb8dd8be2b7f35c12fbee03ee49f71bcab4db32a47c43b749a8938eab0ec74b2"}
Apr 21 15:10:46.890219 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:46.889918 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4"
Apr 21 15:10:46.920281 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:46.920221 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4" podStartSLOduration=1.920203058 podStartE2EDuration="1.920203058s" podCreationTimestamp="2026-04-21 15:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:10:46.918929279 +0000 UTC m=+2711.584949946" watchObservedRunningTime="2026-04-21 15:10:46.920203058 +0000 UTC m=+2711.586223731"
Apr 21 15:10:47.279340 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:47.279264 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-b4hc4_b3417a6b-baaf-488b-99aa-0815ee21a3b9/download-server/0.log"
Apr 21 15:10:48.667812 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:48.667786 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x44hx_91d1fedb-0634-4e6a-97a4-81d44c528d9f/dns/0.log"
Apr 21 15:10:48.689317 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:48.689293 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x44hx_91d1fedb-0634-4e6a-97a4-81d44c528d9f/kube-rbac-proxy/0.log"
Apr 21 15:10:48.717254 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:48.717226 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5bc2h_fbd47f5e-fbd3-42b6-9631-93eae13c275b/dns-node-resolver/0.log"
Apr 21 15:10:49.409488 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:49.409458 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nwm4s_a9148116-ff1f-4dd1-b372-36d40a8132b7/node-ca/0.log"
Apr 21 15:10:50.521025 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:50.520987 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-2c7jl_a2f05579-a456-46bb-8cd7-e82c3c0a04c2/discovery/0.log"
Apr 21 15:10:50.580850 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:50.580820 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b77d4bb4f-5d8ls_66bb68fe-aa1c-46cd-b3cb-0afd155918f5/kube-auth-proxy/0.log"
Apr 21 15:10:51.349447 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:51.349417 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-f5mvw_000af909-ad67-432e-8f90-beaf88b66978/serve-healthcheck-canary/0.log"
Apr 21 15:10:51.855930 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:51.855902 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2shrn_091d8da6-1925-4483-b412-411d3ab3ec20/kube-rbac-proxy/0.log"
Apr 21 15:10:51.876959 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:51.876937 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2shrn_091d8da6-1925-4483-b412-411d3ab3ec20/exporter/0.log"
Apr 21 15:10:51.898693 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:51.898666 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2shrn_091d8da6-1925-4483-b412-411d3ab3ec20/extractor/0.log"
Apr 21 15:10:52.902689 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:52.902663 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-5qgf4"
Apr 21 15:10:54.152396 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:54.152364 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-ckrj2_bc4e480a-1a65-41e7-85b5-31e172a4dbca/manager/0.log"
Apr 21 15:10:54.278504 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:54.278471 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-c8mtf_9307725b-46ef-4197-880a-edc49567bde2/manager/1.log"
Apr 21 15:10:54.289340 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:54.289316 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-c8mtf_9307725b-46ef-4197-880a-edc49567bde2/manager/2.log"
Apr 21 15:10:54.493440 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:54.493395 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7df645bd74-qg9dd_d06ff8e8-a678-4df5-a145-0b0b2b8a94f2/manager/0.log"
Apr 21 15:10:56.171612 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:10:56.171581 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-9gxr7_39c76ed2-fd0b-4900-b8cd-f3c36d32de46/openshift-lws-operator/0.log"
Apr 21 15:11:02.379897 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:11:02.379861 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9hpmv_8ad4357c-863f-4ad8-b101-d0a7bef5fa90/kube-multus-additional-cni-plugins/0.log"
Apr 21 15:11:02.412138 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:11:02.412098 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9hpmv_8ad4357c-863f-4ad8-b101-d0a7bef5fa90/egress-router-binary-copy/0.log"
Apr 21 15:11:02.436304 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:11:02.436279 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9hpmv_8ad4357c-863f-4ad8-b101-d0a7bef5fa90/cni-plugins/0.log"
Apr 21 15:11:02.465212 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:11:02.465188 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9hpmv_8ad4357c-863f-4ad8-b101-d0a7bef5fa90/bond-cni-plugin/0.log"
Apr 21 15:11:02.488465 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:11:02.488437 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9hpmv_8ad4357c-863f-4ad8-b101-d0a7bef5fa90/routeoverride-cni/0.log"
Apr 21 15:11:02.510561 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:11:02.510536 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9hpmv_8ad4357c-863f-4ad8-b101-d0a7bef5fa90/whereabouts-cni-bincopy/0.log"
Apr 21 15:11:02.535170 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:11:02.535148 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9hpmv_8ad4357c-863f-4ad8-b101-d0a7bef5fa90/whereabouts-cni/0.log"
Apr 21 15:11:02.787167 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:11:02.787138 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hgpkp_7144a992-05ff-4c23-81b5-3daad4c438a6/kube-multus/0.log"
Apr 21 15:11:02.807364 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:11:02.807337 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5bc88_bbebdf2e-6e19-434f-9e3e-bb9880123092/network-metrics-daemon/0.log"
Apr 21 15:11:02.827922 ip-10-0-138-110 kubenswrapper[2576]: I0421 15:11:02.827885 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5bc88_bbebdf2e-6e19-434f-9e3e-bb9880123092/kube-rbac-proxy/0.log"